2025-09-27 00:00:13.006955 | Job console starting
2025-09-27 00:00:13.024470 | Updating git repos
2025-09-27 00:00:13.108112 | Cloning repos into workspace
2025-09-27 00:00:13.318140 | Restoring repo states
2025-09-27 00:00:13.336910 | Merging changes
2025-09-27 00:00:13.336946 | Checking out repos
2025-09-27 00:00:13.767101 | Preparing playbooks
2025-09-27 00:00:14.512822 | Running Ansible setup
2025-09-27 00:00:21.444550 | PRE-RUN START: [trusted : github.com/osism/zuul-config/playbooks/base/pre.yaml@main]
2025-09-27 00:00:22.671416 |
2025-09-27 00:00:22.671604 | PLAY [Base pre]
2025-09-27 00:00:22.738432 |
2025-09-27 00:00:22.738568 | TASK [Setup log path fact]
2025-09-27 00:00:22.787575 | orchestrator | ok
2025-09-27 00:00:22.834522 |
2025-09-27 00:00:22.834698 | TASK [set-zuul-log-path-fact : Set log path for a build]
2025-09-27 00:00:22.891490 | orchestrator | ok
2025-09-27 00:00:22.931124 |
2025-09-27 00:00:22.931228 | TASK [emit-job-header : Print job information]
2025-09-27 00:00:23.019659 | # Job Information
2025-09-27 00:00:23.019811 | Ansible Version: 2.16.14
2025-09-27 00:00:23.019840 | Job: testbed-deploy-in-a-nutshell-with-tempest-ubuntu-24.04
2025-09-27 00:00:23.019868 | Pipeline: periodic-midnight
2025-09-27 00:00:23.019887 | Executor: 521e9411259a
2025-09-27 00:00:23.019905 | Triggered by: https://github.com/osism/testbed
2025-09-27 00:00:23.019923 | Event ID: a0cee54aec204653a6a1458a03e9401f
2025-09-27 00:00:23.025343 |
2025-09-27 00:00:23.025429 | LOOP [emit-job-header : Print node information]
2025-09-27 00:00:23.291953 | orchestrator | ok:
2025-09-27 00:00:23.292093 | orchestrator | # Node Information
2025-09-27 00:00:23.292122 | orchestrator | Inventory Hostname: orchestrator
2025-09-27 00:00:23.292143 | orchestrator | Hostname: zuul-static-regiocloud-infra-1
2025-09-27 00:00:23.292162 | orchestrator | Username: zuul-testbed03
2025-09-27 00:00:23.292180 | orchestrator | Distro: Debian 12.12
2025-09-27 00:00:23.292199 | orchestrator | Provider: static-testbed
2025-09-27 00:00:23.292217 | orchestrator | Region:
2025-09-27 00:00:23.292234 | orchestrator | Label: testbed-orchestrator
2025-09-27 00:00:23.292251 | orchestrator | Product Name: OpenStack Nova
2025-09-27 00:00:23.292267 | orchestrator | Interface IP: 81.163.193.140
2025-09-27 00:00:23.309676 |
2025-09-27 00:00:23.309769 | TASK [log-inventory : Ensure Zuul Ansible directory exists]
2025-09-27 00:00:24.640923 | orchestrator -> localhost | changed
2025-09-27 00:00:24.648314 |
2025-09-27 00:00:24.648404 | TASK [log-inventory : Copy ansible inventory to logs dir]
2025-09-27 00:00:26.435365 | orchestrator -> localhost | changed
2025-09-27 00:00:26.452144 |
2025-09-27 00:00:26.452240 | TASK [add-build-sshkey : Check to see if ssh key was already created for this build]
2025-09-27 00:00:27.500879 | orchestrator -> localhost | ok
2025-09-27 00:00:27.506792 |
2025-09-27 00:00:27.506892 | TASK [add-build-sshkey : Create a new key in workspace based on build UUID]
2025-09-27 00:00:27.545252 | orchestrator | ok
2025-09-27 00:00:27.581931 | orchestrator | included: /var/lib/zuul/builds/ecfe00c5452b48c9945e9f444f5b6112/trusted/project_1/github.com/osism/openinfra-zuul-jobs/roles/add-build-sshkey/tasks/create-key-and-replace.yaml
2025-09-27 00:00:27.609959 |
2025-09-27 00:00:27.610056 | TASK [add-build-sshkey : Create Temp SSH key]
2025-09-27 00:00:33.594725 | orchestrator -> localhost | Generating public/private rsa key pair.
2025-09-27 00:00:33.595673 | orchestrator -> localhost | Your identification has been saved in /var/lib/zuul/builds/ecfe00c5452b48c9945e9f444f5b6112/work/ecfe00c5452b48c9945e9f444f5b6112_id_rsa
2025-09-27 00:00:33.595725 | orchestrator -> localhost | Your public key has been saved in /var/lib/zuul/builds/ecfe00c5452b48c9945e9f444f5b6112/work/ecfe00c5452b48c9945e9f444f5b6112_id_rsa.pub
2025-09-27 00:00:33.595751 | orchestrator -> localhost | The key fingerprint is:
2025-09-27 00:00:33.595774 | orchestrator -> localhost | SHA256:QGaa1n78GeCt4NVoDtyJVKuRN/UbrSNWlqUNxbn2nW4 zuul-build-sshkey
2025-09-27 00:00:33.595818 | orchestrator -> localhost | The key's randomart image is:
2025-09-27 00:00:33.595849 | orchestrator -> localhost | +---[RSA 3072]----+
2025-09-27 00:00:33.595869 | orchestrator -> localhost | | + . . .oo. |
2025-09-27 00:00:33.595888 | orchestrator -> localhost | | B o o . Bo |
2025-09-27 00:00:33.595906 | orchestrator -> localhost | | + * = B o. |
2025-09-27 00:00:33.595923 | orchestrator -> localhost | | . + X B o +o |
2025-09-27 00:00:33.595941 | orchestrator -> localhost | | B S * +. .o|
2025-09-27 00:00:33.595960 | orchestrator -> localhost | | . B + + . .o|
2025-09-27 00:00:33.595978 | orchestrator -> localhost | | . o o . |
2025-09-27 00:00:33.595997 | orchestrator -> localhost | | E |
2025-09-27 00:00:33.596017 | orchestrator -> localhost | | . |
2025-09-27 00:00:33.596036 | orchestrator -> localhost | +----[SHA256]-----+
2025-09-27 00:00:33.596091 | orchestrator -> localhost | ok: Runtime: 0:00:04.543245
2025-09-27 00:00:33.603153 |
2025-09-27 00:00:33.603240 | TASK [add-build-sshkey : Remote setup ssh keys (linux)]
2025-09-27 00:00:33.640773 | orchestrator | ok
2025-09-27 00:00:33.658985 | orchestrator | included: /var/lib/zuul/builds/ecfe00c5452b48c9945e9f444f5b6112/trusted/project_1/github.com/osism/openinfra-zuul-jobs/roles/add-build-sshkey/tasks/remote-linux.yaml
2025-09-27 00:00:33.683066 |
2025-09-27 00:00:33.685144 | TASK [add-build-sshkey : Remove previously added zuul-build-sshkey]
2025-09-27 00:00:33.728336 | orchestrator | skipping: Conditional result was False
2025-09-27 00:00:33.735157 |
2025-09-27 00:00:33.735251 | TASK [add-build-sshkey : Enable access via build key on all nodes]
2025-09-27 00:00:34.454897 | orchestrator | changed
2025-09-27 00:00:34.460478 |
2025-09-27 00:00:34.460561 | TASK [add-build-sshkey : Make sure user has a .ssh]
2025-09-27 00:00:34.758284 | orchestrator | ok
2025-09-27 00:00:34.763465 |
2025-09-27 00:00:34.763547 | TASK [add-build-sshkey : Install build private key as SSH key on all nodes]
2025-09-27 00:00:35.158773 | orchestrator | ok
2025-09-27 00:00:35.169349 |
2025-09-27 00:00:35.169447 | TASK [add-build-sshkey : Install build public key as SSH key on all nodes]
2025-09-27 00:00:35.613759 | orchestrator | ok
2025-09-27 00:00:35.618842 |
2025-09-27 00:00:35.618927 | TASK [add-build-sshkey : Remote setup ssh keys (windows)]
2025-09-27 00:00:35.651462 | orchestrator | skipping: Conditional result was False
2025-09-27 00:00:35.657087 |
2025-09-27 00:00:35.657168 | TASK [remove-zuul-sshkey : Remove master key from local agent]
2025-09-27 00:00:36.793245 | orchestrator -> localhost | changed
2025-09-27 00:00:36.810217 |
2025-09-27 00:00:36.810308 | TASK [add-build-sshkey : Add back temp key]
2025-09-27 00:00:37.780329 | orchestrator -> localhost | Identity added: /var/lib/zuul/builds/ecfe00c5452b48c9945e9f444f5b6112/work/ecfe00c5452b48c9945e9f444f5b6112_id_rsa (zuul-build-sshkey)
2025-09-27 00:00:37.780511 | orchestrator -> localhost | ok: Runtime: 0:00:00.023910
2025-09-27 00:00:37.786178 |
2025-09-27 00:00:37.786261 | TASK [add-build-sshkey : Verify we can still SSH to all nodes]
2025-09-27 00:00:38.623542 | orchestrator | ok
2025-09-27 00:00:38.633044 |
2025-09-27 00:00:38.633130 | TASK [add-build-sshkey : Verify we can still SSH to all nodes (windows)]
2025-09-27 00:00:38.677249 | orchestrator | skipping: Conditional result was False
2025-09-27 00:00:38.789873 |
2025-09-27 00:00:38.789969 | TASK [start-zuul-console : Start zuul_console daemon.]
2025-09-27 00:00:39.499999 | orchestrator | ok
2025-09-27 00:00:39.521042 |
2025-09-27 00:00:39.521138 | TASK [validate-host : Define zuul_info_dir fact]
2025-09-27 00:00:39.568285 | orchestrator | ok
2025-09-27 00:00:39.580692 |
2025-09-27 00:00:39.580807 | TASK [validate-host : Ensure Zuul Ansible directory exists]
2025-09-27 00:00:40.472355 | orchestrator -> localhost | ok
2025-09-27 00:00:40.478373 |
2025-09-27 00:00:40.478451 | TASK [validate-host : Collect information about the host]
2025-09-27 00:00:41.656590 | orchestrator | ok
2025-09-27 00:00:41.670723 |
2025-09-27 00:00:41.670827 | TASK [validate-host : Sanitize hostname]
2025-09-27 00:00:41.788771 | orchestrator | ok
2025-09-27 00:00:41.793087 |
2025-09-27 00:00:41.793160 | TASK [validate-host : Write out all ansible variables/facts known for each host]
2025-09-27 00:00:42.696836 | orchestrator -> localhost | changed
2025-09-27 00:00:42.701959 |
2025-09-27 00:00:42.702049 | TASK [validate-host : Collect information about zuul worker]
2025-09-27 00:00:43.081379 | orchestrator | ok
2025-09-27 00:00:43.085569 |
2025-09-27 00:00:43.085647 | TASK [validate-host : Write out all zuul information for each host]
2025-09-27 00:00:44.279824 | orchestrator -> localhost | changed
2025-09-27 00:00:44.289516 |
2025-09-27 00:00:44.289604 | TASK [prepare-workspace-log : Start zuul_console daemon.]
2025-09-27 00:00:44.605436 | orchestrator | ok
2025-09-27 00:00:44.613982 |
2025-09-27 00:00:44.614066 | TASK [prepare-workspace-log : Synchronize src repos to workspace directory.]
2025-09-27 00:01:23.618496 | orchestrator | changed:
2025-09-27 00:01:23.618897 | orchestrator | .d..t...... src/
2025-09-27 00:01:23.618953 | orchestrator | .d..t...... src/github.com/
2025-09-27 00:01:23.618980 | orchestrator | .d..t...... src/github.com/osism/
2025-09-27 00:01:23.619003 | orchestrator | .d..t...... src/github.com/osism/ansible-collection-commons/
2025-09-27 00:01:23.619023 | orchestrator | RedHat.yml
2025-09-27 00:01:23.646077 | orchestrator | .L..t...... src/github.com/osism/ansible-collection-commons/roles/repository/tasks/CentOS.yml -> RedHat.yml
2025-09-27 00:01:23.646096 | orchestrator | RedHat.yml
2025-09-27 00:01:23.646148 | orchestrator | = 1.53.0"...
2025-09-27 00:01:35.738703 | orchestrator | 00:01:35.738 STDOUT terraform: - Finding hashicorp/local versions matching ">= 2.2.0"...
2025-09-27 00:01:35.895975 | orchestrator | 00:01:35.895 STDOUT terraform: - Installing hashicorp/null v3.2.4...
2025-09-27 00:01:36.467113 | orchestrator | 00:01:36.466 STDOUT terraform: - Installed hashicorp/null v3.2.4 (signed, key ID 0C0AF313E5FD9F80)
2025-09-27 00:01:36.541677 | orchestrator | 00:01:36.541 STDOUT terraform: - Installing terraform-provider-openstack/openstack v3.3.2...
2025-09-27 00:01:37.202266 | orchestrator | 00:01:37.202 STDOUT terraform: - Installed terraform-provider-openstack/openstack v3.3.2 (signed, key ID 4F80527A391BEFD2)
2025-09-27 00:01:37.272948 | orchestrator | 00:01:37.272 STDOUT terraform: - Installing hashicorp/local v2.5.3...
2025-09-27 00:01:37.930226 | orchestrator | 00:01:37.929 STDOUT terraform: - Installed hashicorp/local v2.5.3 (signed, key ID 0C0AF313E5FD9F80)
2025-09-27 00:01:37.930296 | orchestrator | 00:01:37.930 STDOUT terraform: Providers are signed by their developers.
2025-09-27 00:01:37.930383 | orchestrator | 00:01:37.930 STDOUT terraform: If you'd like to know more about provider signing, you can read about it here:
2025-09-27 00:01:37.930504 | orchestrator | 00:01:37.930 STDOUT terraform: https://opentofu.org/docs/cli/plugins/signing/
2025-09-27 00:01:37.930654 | orchestrator | 00:01:37.930 STDOUT terraform: OpenTofu has created a lock file .terraform.lock.hcl to record the provider
2025-09-27 00:01:37.930808 | orchestrator | 00:01:37.930 STDOUT terraform: selections it made above. Include this file in your version control repository
2025-09-27 00:01:37.930955 | orchestrator | 00:01:37.930 STDOUT terraform: so that OpenTofu can guarantee to make the same selections by default when
2025-09-27 00:01:37.931034 | orchestrator | 00:01:37.930 STDOUT terraform: you run "tofu init" in the future.
2025-09-27 00:01:37.931118 | orchestrator | 00:01:37.931 STDOUT terraform: OpenTofu has been successfully initialized!
2025-09-27 00:01:37.931278 | orchestrator | 00:01:37.931 STDOUT terraform: You may now begin working with OpenTofu. Try running "tofu plan" to see
2025-09-27 00:01:37.931423 | orchestrator | 00:01:37.931 STDOUT terraform: any changes that are required for your infrastructure. All OpenTofu commands
2025-09-27 00:01:37.931450 | orchestrator | 00:01:37.931 STDOUT terraform: should now work.
2025-09-27 00:01:37.931580 | orchestrator | 00:01:37.931 STDOUT terraform: If you ever set or change modules or backend configuration for OpenTofu,
2025-09-27 00:01:37.931677 | orchestrator | 00:01:37.931 STDOUT terraform: rerun this command to reinitialize your working directory. If you forget, other
2025-09-27 00:01:37.931763 | orchestrator | 00:01:37.931 STDOUT terraform: commands will detect it and remind you to do so if necessary.
2025-09-27 00:01:38.023733 | orchestrator | 00:01:38.023 WARN  The `TERRAGRUNT_TFPATH` environment variable is deprecated and will be removed in a future version of Terragrunt. Use `TG_TF_PATH=/home/zuul-testbed03/terraform` instead.
2025-09-27 00:01:38.023804 | orchestrator | 00:01:38.023 WARN  The `workspace` command is deprecated and will be removed in a future version of Terragrunt. Use `terragrunt run -- workspace` instead.
2025-09-27 00:01:38.217663 | orchestrator | 00:01:38.217 STDOUT terraform: Created and switched to workspace "ci"!
2025-09-27 00:01:38.217719 | orchestrator | 00:01:38.217 STDOUT terraform: You're now on a new, empty workspace. Workspaces isolate their state,
2025-09-27 00:01:38.217728 | orchestrator | 00:01:38.217 STDOUT terraform: so if you run "tofu plan" OpenTofu will not see any existing state
2025-09-27 00:01:38.217734 | orchestrator | 00:01:38.217 STDOUT terraform: for this configuration.
2025-09-27 00:01:38.358188 | orchestrator | 00:01:38.358 WARN  The `TERRAGRUNT_TFPATH` environment variable is deprecated and will be removed in a future version of Terragrunt. Use `TG_TF_PATH=/home/zuul-testbed03/terraform` instead.
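Editor's note: the init output above shows which providers OpenTofu resolved (hashicorp/null v3.2.4, terraform-provider-openstack/openstack v3.3.2, hashicorp/local v2.5.3) but not the testbed's actual configuration. A minimal required_providers sketch that would be consistent with this output follows; the version constraints and the required_version line are assumptions (the log only hints at ">= 2.2.0" for local and a truncated ">= 1.53.0" fragment), not values taken from the osism/testbed repository.

terraform {
  # Assumed constraint; OpenTofu reads the standard "terraform" settings block.
  required_version = ">= 1.5.0"

  required_providers {
    # Resolved to v3.2.4 during this init run.
    null = {
      source  = "hashicorp/null"
      version = ">= 3.2.0" # assumed constraint
    }
    # Resolved to v3.3.2; the truncated ">= 1.53.0" fragment in the rsync/init
    # section above most plausibly belongs to this provider's constraint.
    openstack = {
      source  = "terraform-provider-openstack/openstack"
      version = ">= 1.53.0" # assumed constraint
    }
    # Resolved to v2.5.3; the log shows the constraint ">= 2.2.0".
    local = {
      source  = "hashicorp/local"
      version = ">= 2.2.0"
    }
  }
}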
2025-09-27 00:01:38.358272 | orchestrator | 00:01:38.358 WARN  The `fmt` command is deprecated and will be removed in a future version of Terragrunt. Use `terragrunt run -- fmt` instead. 2025-09-27 00:01:38.471119 | orchestrator | 00:01:38.470 STDOUT terraform: ci.auto.tfvars 2025-09-27 00:01:39.383278 | orchestrator | 00:01:39.383 STDOUT terraform: default_custom.tf 2025-09-27 00:01:41.432596 | orchestrator | 00:01:41.432 WARN  The `TERRAGRUNT_TFPATH` environment variable is deprecated and will be removed in a future version of Terragrunt. Use `TG_TF_PATH=/home/zuul-testbed03/terraform` instead. 2025-09-27 00:01:42.332382 | orchestrator | 00:01:42.331 STDOUT terraform: data.openstack_networking_network_v2.public: Reading... 2025-09-27 00:01:42.877980 | orchestrator | 00:01:42.876 STDOUT terraform: data.openstack_networking_network_v2.public: Read complete after 1s [id=e6be7364-bfd8-4de7-8120-8f41c69a139a] 2025-09-27 00:01:43.062823 | orchestrator | 00:01:43.062 STDOUT terraform: OpenTofu used the selected providers to generate the following execution 2025-09-27 00:01:43.062893 | orchestrator | 00:01:43.062 STDOUT terraform: plan. Resource actions are indicated with the following symbols: 2025-09-27 00:01:43.062900 | orchestrator | 00:01:43.062 STDOUT terraform:  + create 2025-09-27 00:01:43.062907 | orchestrator | 00:01:43.062 STDOUT terraform:  <= read (data resources) 2025-09-27 00:01:43.062912 | orchestrator | 00:01:43.062 STDOUT terraform: OpenTofu will perform the following actions: 2025-09-27 00:01:43.062950 | orchestrator | 00:01:43.062 STDOUT terraform:  # data.openstack_images_image_v2.image will be read during apply 2025-09-27 00:01:43.062985 | orchestrator | 00:01:43.062 STDOUT terraform:  # (config refers to values not yet known) 2025-09-27 00:01:43.063023 | orchestrator | 00:01:43.062 STDOUT terraform:  <= data "openstack_images_image_v2" "image" { 2025-09-27 00:01:43.063058 | orchestrator | 00:01:43.063 STDOUT terraform:  + checksum = (known after apply) 2025-09-27 00:01:43.063102 | orchestrator | 00:01:43.063 STDOUT terraform:  + created_at = (known after apply) 2025-09-27 00:01:43.063140 | orchestrator | 00:01:43.063 STDOUT terraform:  + file = (known after apply) 2025-09-27 00:01:43.063202 | orchestrator | 00:01:43.063 STDOUT terraform:  + id = (known after apply) 2025-09-27 00:01:43.063236 | orchestrator | 00:01:43.063 STDOUT terraform:  + metadata = (known after apply) 2025-09-27 00:01:43.063265 | orchestrator | 00:01:43.063 STDOUT terraform:  + min_disk_gb = (known after apply) 2025-09-27 00:01:43.063302 | orchestrator | 00:01:43.063 STDOUT terraform:  + min_ram_mb = (known after apply) 2025-09-27 00:01:43.063324 | orchestrator | 00:01:43.063 STDOUT terraform:  + most_recent = true 2025-09-27 00:01:43.063370 | orchestrator | 00:01:43.063 STDOUT terraform:  + name = (known after apply) 2025-09-27 00:01:43.063392 | orchestrator | 00:01:43.063 STDOUT terraform:  + protected = (known after apply) 2025-09-27 00:01:43.063426 | orchestrator | 00:01:43.063 STDOUT terraform:  + region = (known after apply) 2025-09-27 00:01:43.063462 | orchestrator | 00:01:43.063 STDOUT terraform:  + schema = (known after apply) 2025-09-27 00:01:43.063500 | orchestrator | 00:01:43.063 STDOUT terraform:  + size_bytes = (known after apply) 2025-09-27 00:01:43.063537 | orchestrator | 00:01:43.063 STDOUT terraform:  + tags = (known after apply) 2025-09-27 00:01:43.063563 | orchestrator | 00:01:43.063 STDOUT terraform:  + updated_at = (known after apply) 2025-09-27 00:01:43.063582 | orchestrator | 
00:01:43.063 STDOUT terraform:  } 2025-09-27 00:01:43.063638 | orchestrator | 00:01:43.063 STDOUT terraform:  # data.openstack_images_image_v2.image_node will be read during apply 2025-09-27 00:01:43.063669 | orchestrator | 00:01:43.063 STDOUT terraform:  # (config refers to values not yet known) 2025-09-27 00:01:43.063710 | orchestrator | 00:01:43.063 STDOUT terraform:  <= data "openstack_images_image_v2" "image_node" { 2025-09-27 00:01:43.063742 | orchestrator | 00:01:43.063 STDOUT terraform:  + checksum = (known after apply) 2025-09-27 00:01:43.063796 | orchestrator | 00:01:43.063 STDOUT terraform:  + created_at = (known after apply) 2025-09-27 00:01:43.063819 | orchestrator | 00:01:43.063 STDOUT terraform:  + file = (known after apply) 2025-09-27 00:01:43.063854 | orchestrator | 00:01:43.063 STDOUT terraform:  + id = (known after apply) 2025-09-27 00:01:43.063882 | orchestrator | 00:01:43.063 STDOUT terraform:  + metadata = (known after apply) 2025-09-27 00:01:43.063915 | orchestrator | 00:01:43.063 STDOUT terraform:  + min_disk_gb = (known after apply) 2025-09-27 00:01:43.063966 | orchestrator | 00:01:43.063 STDOUT terraform:  + min_ram_mb = (known after apply) 2025-09-27 00:01:43.063978 | orchestrator | 00:01:43.063 STDOUT terraform:  + most_recent = true 2025-09-27 00:01:43.063999 | orchestrator | 00:01:43.063 STDOUT terraform:  + name = (known after apply) 2025-09-27 00:01:43.064038 | orchestrator | 00:01:43.063 STDOUT terraform:  + protected = (known after apply) 2025-09-27 00:01:43.064086 | orchestrator | 00:01:43.064 STDOUT terraform:  + region = (known after apply) 2025-09-27 00:01:43.064127 | orchestrator | 00:01:43.064 STDOUT terraform:  + schema = (known after apply) 2025-09-27 00:01:43.064163 | orchestrator | 00:01:43.064 STDOUT terraform:  + size_bytes = (known after apply) 2025-09-27 00:01:43.064196 | orchestrator | 00:01:43.064 STDOUT terraform:  + tags = (known after apply) 2025-09-27 00:01:43.064241 | orchestrator | 00:01:43.064 STDOUT terraform:  + updated_at = (known after apply) 2025-09-27 00:01:43.064247 | orchestrator | 00:01:43.064 STDOUT terraform:  } 2025-09-27 00:01:43.064278 | orchestrator | 00:01:43.064 STDOUT terraform:  # local_file.MANAGER_ADDRESS will be created 2025-09-27 00:01:43.064329 | orchestrator | 00:01:43.064 STDOUT terraform:  + resource "local_file" "MANAGER_ADDRESS" { 2025-09-27 00:01:43.064355 | orchestrator | 00:01:43.064 STDOUT terraform:  + content = (known after apply) 2025-09-27 00:01:43.064430 | orchestrator | 00:01:43.064 STDOUT terraform:  + content_base64sha256 = (known after apply) 2025-09-27 00:01:43.064439 | orchestrator | 00:01:43.064 STDOUT terraform:  + content_base64sha512 = (known after apply) 2025-09-27 00:01:43.064490 | orchestrator | 00:01:43.064 STDOUT terraform:  + content_md5 = (known after apply) 2025-09-27 00:01:43.064523 | orchestrator | 00:01:43.064 STDOUT terraform:  + content_sha1 = (known after apply) 2025-09-27 00:01:43.064563 | orchestrator | 00:01:43.064 STDOUT terraform:  + content_sha256 = (known after apply) 2025-09-27 00:01:43.064606 | orchestrator | 00:01:43.064 STDOUT terraform:  + content_sha512 = (known after apply) 2025-09-27 00:01:43.064632 | orchestrator | 00:01:43.064 STDOUT terraform:  + directory_permission = "0777" 2025-09-27 00:01:43.064663 | orchestrator | 00:01:43.064 STDOUT terraform:  + file_permission = "0644" 2025-09-27 00:01:43.064705 | orchestrator | 00:01:43.064 STDOUT terraform:  + filename = ".MANAGER_ADDRESS.ci" 2025-09-27 00:01:43.064747 | orchestrator | 00:01:43.064 STDOUT 
terraform:  + id = (known after apply) 2025-09-27 00:01:43.064763 | orchestrator | 00:01:43.064 STDOUT terraform:  } 2025-09-27 00:01:43.064792 | orchestrator | 00:01:43.064 STDOUT terraform:  # local_file.id_rsa_pub will be created 2025-09-27 00:01:43.064821 | orchestrator | 00:01:43.064 STDOUT terraform:  + resource "local_file" "id_rsa_pub" { 2025-09-27 00:01:43.064864 | orchestrator | 00:01:43.064 STDOUT terraform:  + content = (known after apply) 2025-09-27 00:01:43.064908 | orchestrator | 00:01:43.064 STDOUT terraform:  + content_base64sha256 = (known after apply) 2025-09-27 00:01:43.064951 | orchestrator | 00:01:43.064 STDOUT terraform:  + content_base64sha512 = (known after apply) 2025-09-27 00:01:43.064988 | orchestrator | 00:01:43.064 STDOUT terraform:  + content_md5 = (known after apply) 2025-09-27 00:01:43.065039 | orchestrator | 00:01:43.064 STDOUT terraform:  + content_sha1 = (known after apply) 2025-09-27 00:01:43.065070 | orchestrator | 00:01:43.065 STDOUT terraform:  + content_sha256 = (known after apply) 2025-09-27 00:01:43.065126 | orchestrator | 00:01:43.065 STDOUT terraform:  + content_sha512 = (known after apply) 2025-09-27 00:01:43.065143 | orchestrator | 00:01:43.065 STDOUT terraform:  + directory_permission = "0777" 2025-09-27 00:01:43.065168 | orchestrator | 00:01:43.065 STDOUT terraform:  + file_permission = "0644" 2025-09-27 00:01:43.065214 | orchestrator | 00:01:43.065 STDOUT terraform:  + filename = ".id_rsa.ci.pub" 2025-09-27 00:01:43.065244 | orchestrator | 00:01:43.065 STDOUT terraform:  + id = (known after apply) 2025-09-27 00:01:43.065256 | orchestrator | 00:01:43.065 STDOUT terraform:  } 2025-09-27 00:01:43.065305 | orchestrator | 00:01:43.065 STDOUT terraform:  # local_file.inventory will be created 2025-09-27 00:01:43.065313 | orchestrator | 00:01:43.065 STDOUT terraform:  + resource "local_file" "inventory" { 2025-09-27 00:01:43.065356 | orchestrator | 00:01:43.065 STDOUT terraform:  + content = (known after apply) 2025-09-27 00:01:43.065400 | orchestrator | 00:01:43.065 STDOUT terraform:  + content_base64sha256 = (known after apply) 2025-09-27 00:01:43.065440 | orchestrator | 00:01:43.065 STDOUT terraform:  + content_base64sha512 = (known after apply) 2025-09-27 00:01:43.065483 | orchestrator | 00:01:43.065 STDOUT terraform:  + content_md5 = (known after apply) 2025-09-27 00:01:43.065522 | orchestrator | 00:01:43.065 STDOUT terraform:  + content_sha1 = (known after apply) 2025-09-27 00:01:43.065563 | orchestrator | 00:01:43.065 STDOUT terraform:  + content_sha256 = (known after apply) 2025-09-27 00:01:43.065603 | orchestrator | 00:01:43.065 STDOUT terraform:  + content_sha512 = (known after apply) 2025-09-27 00:01:43.065648 | orchestrator | 00:01:43.065 STDOUT terraform:  + directory_permission = "0777" 2025-09-27 00:01:43.065655 | orchestrator | 00:01:43.065 STDOUT terraform:  + file_permission = "0644" 2025-09-27 00:01:43.065696 | orchestrator | 00:01:43.065 STDOUT terraform:  + filename = "inventory.ci" 2025-09-27 00:01:43.065744 | orchestrator | 00:01:43.065 STDOUT terraform:  + id = (known after apply) 2025-09-27 00:01:43.065751 | orchestrator | 00:01:43.065 STDOUT terraform:  } 2025-09-27 00:01:43.065853 | orchestrator | 00:01:43.065 STDOUT terraform:  # local_sensitive_file.id_rsa will be created 2025-09-27 00:01:43.065905 | orchestrator | 00:01:43.065 STDOUT terraform:  + resource "local_sensitive_file" "id_rsa" { 2025-09-27 00:01:43.065930 | orchestrator | 00:01:43.065 STDOUT terraform:  + content = (sensitive value) 2025-09-27 
00:01:43.065977 | orchestrator | 00:01:43.065 STDOUT terraform:  + content_base64sha256 = (known after apply) 2025-09-27 00:01:43.066012 | orchestrator | 00:01:43.065 STDOUT terraform:  + content_base64sha512 = (known after apply) 2025-09-27 00:01:43.066067 | orchestrator | 00:01:43.066 STDOUT terraform:  + content_md5 = (known after apply) 2025-09-27 00:01:43.066127 | orchestrator | 00:01:43.066 STDOUT terraform:  + content_sha1 = (known after apply) 2025-09-27 00:01:43.066179 | orchestrator | 00:01:43.066 STDOUT terraform:  + content_sha256 = (known after apply) 2025-09-27 00:01:43.066222 | orchestrator | 00:01:43.066 STDOUT terraform:  + content_sha512 = (known after apply) 2025-09-27 00:01:43.066251 | orchestrator | 00:01:43.066 STDOUT terraform:  + directory_permission = "0700" 2025-09-27 00:01:43.066300 | orchestrator | 00:01:43.066 STDOUT terraform:  + file_permission = "0600" 2025-09-27 00:01:43.066308 | orchestrator | 00:01:43.066 STDOUT terraform:  + filename = ".id_rsa.ci" 2025-09-27 00:01:43.066350 | orchestrator | 00:01:43.066 STDOUT terraform:  + id = (known after apply) 2025-09-27 00:01:43.066357 | orchestrator | 00:01:43.066 STDOUT terraform:  } 2025-09-27 00:01:43.066401 | orchestrator | 00:01:43.066 STDOUT terraform:  # null_resource.node_semaphore will be created 2025-09-27 00:01:43.066434 | orchestrator | 00:01:43.066 STDOUT terraform:  + resource "null_resource" "node_semaphore" { 2025-09-27 00:01:43.066473 | orchestrator | 00:01:43.066 STDOUT terraform:  + id = (known after apply) 2025-09-27 00:01:43.066479 | orchestrator | 00:01:43.066 STDOUT terraform:  } 2025-09-27 00:01:43.066553 | orchestrator | 00:01:43.066 STDOUT terraform:  # openstack_blockstorage_volume_v3.manager_base_volume[0] will be created 2025-09-27 00:01:43.066599 | orchestrator | 00:01:43.066 STDOUT terraform:  + resource "openstack_blockstorage_volume_v3" "manager_base_volume" { 2025-09-27 00:01:43.066638 | orchestrator | 00:01:43.066 STDOUT terraform:  + attachment = (known after apply) 2025-09-27 00:01:43.066661 | orchestrator | 00:01:43.066 STDOUT terraform:  + availability_zone = "nova" 2025-09-27 00:01:43.066710 | orchestrator | 00:01:43.066 STDOUT terraform:  + id = (known after apply) 2025-09-27 00:01:43.066744 | orchestrator | 00:01:43.066 STDOUT terraform:  + image_id = (known after apply) 2025-09-27 00:01:43.066785 | orchestrator | 00:01:43.066 STDOUT terraform:  + metadata = (known after apply) 2025-09-27 00:01:43.066837 | orchestrator | 00:01:43.066 STDOUT terraform:  + name = "testbed-volume-manager-base" 2025-09-27 00:01:43.066898 | orchestrator | 00:01:43.066 STDOUT terraform:  + region = (known after apply) 2025-09-27 00:01:43.066904 | orchestrator | 00:01:43.066 STDOUT terraform:  + size = 80 2025-09-27 00:01:43.066927 | orchestrator | 00:01:43.066 STDOUT terraform:  + volume_retype_policy = "never" 2025-09-27 00:01:43.066955 | orchestrator | 00:01:43.066 STDOUT terraform:  + volume_type = "ssd" 2025-09-27 00:01:43.066971 | orchestrator | 00:01:43.066 STDOUT terraform:  } 2025-09-27 00:01:43.067021 | orchestrator | 00:01:43.066 STDOUT terraform:  # openstack_blockstorage_volume_v3.node_base_volume[0] will be created 2025-09-27 00:01:43.067074 | orchestrator | 00:01:43.067 STDOUT terraform:  + resource "openstack_blockstorage_volume_v3" "node_base_volume" { 2025-09-27 00:01:43.067117 | orchestrator | 00:01:43.067 STDOUT terraform:  + attachment = (known after apply) 2025-09-27 00:01:43.067353 | orchestrator | 00:01:43.067 STDOUT terraform:  + availability_zone = "nova" 2025-09-27 
00:01:43.067459 | orchestrator | 00:01:43.067 STDOUT terraform:  + id = (known after apply) 2025-09-27 00:01:43.067476 | orchestrator | 00:01:43.067 STDOUT terraform:  + image_id = (known after apply) 2025-09-27 00:01:43.067487 | orchestrator | 00:01:43.067 STDOUT terraform:  + metadata = (known after apply) 2025-09-27 00:01:43.067513 | orchestrator | 00:01:43.067 STDOUT terraform:  + name = "testbed-volume-0-node-base" 2025-09-27 00:01:43.067525 | orchestrator | 00:01:43.067 STDOUT terraform:  + region = (known after apply) 2025-09-27 00:01:43.067536 | orchestrator | 00:01:43.067 STDOUT terraform:  + size = 80 2025-09-27 00:01:43.067547 | orchestrator | 00:01:43.067 STDOUT terraform:  + volume_retype_policy = "never" 2025-09-27 00:01:43.067558 | orchestrator | 00:01:43.067 STDOUT terraform:  + volume_type = "ssd" 2025-09-27 00:01:43.067569 | orchestrator | 00:01:43.067 STDOUT terraform:  } 2025-09-27 00:01:43.067580 | orchestrator | 00:01:43.067 STDOUT terraform:  # openstack_blockstorage_volume_v3.node_base_volume[1] will be created 2025-09-27 00:01:43.067596 | orchestrator | 00:01:43.067 STDOUT terraform:  + resource "openstack_blockstorage_volume_v3" "node_base_volume" { 2025-09-27 00:01:43.067607 | orchestrator | 00:01:43.067 STDOUT terraform:  + attachment = (known after apply) 2025-09-27 00:01:43.067639 | orchestrator | 00:01:43.067 STDOUT terraform:  + availability_zone = "nova" 2025-09-27 00:01:43.067655 | orchestrator | 00:01:43.067 STDOUT terraform:  + id = (known after apply) 2025-09-27 00:01:43.067666 | orchestrator | 00:01:43.067 STDOUT terraform:  + image_id = (known after apply) 2025-09-27 00:01:43.067712 | orchestrator | 00:01:43.067 STDOUT terraform:  + metadata = (known after apply) 2025-09-27 00:01:43.067754 | orchestrator | 00:01:43.067 STDOUT terraform:  + name = "testbed-volume-1-node-base" 2025-09-27 00:01:43.067798 | orchestrator | 00:01:43.067 STDOUT terraform:  + region = (known after apply) 2025-09-27 00:01:43.067816 | orchestrator | 00:01:43.067 STDOUT terraform:  + size = 80 2025-09-27 00:01:43.067830 | orchestrator | 00:01:43.067 STDOUT terraform:  + volume_retype_policy = "never" 2025-09-27 00:01:43.067869 | orchestrator | 00:01:43.067 STDOUT terraform:  + volume_type = "ssd" 2025-09-27 00:01:43.067882 | orchestrator | 00:01:43.067 STDOUT terraform:  } 2025-09-27 00:01:43.067943 | orchestrator | 00:01:43.067 STDOUT terraform:  # openstack_blockstorage_volume_v3.node_base_volume[2] will be created 2025-09-27 00:01:43.067988 | orchestrator | 00:01:43.067 STDOUT terraform:  + resource "openstack_blockstorage_volume_v3" "node_base_volume" { 2025-09-27 00:01:43.068027 | orchestrator | 00:01:43.067 STDOUT terraform:  + attachment = (known after apply) 2025-09-27 00:01:43.068056 | orchestrator | 00:01:43.068 STDOUT terraform:  + availability_zone = "nova" 2025-09-27 00:01:43.068085 | orchestrator | 00:01:43.068 STDOUT terraform:  + id = (known after apply) 2025-09-27 00:01:43.068152 | orchestrator | 00:01:43.068 STDOUT terraform:  + image_id = (known after apply) 2025-09-27 00:01:43.068170 | orchestrator | 00:01:43.068 STDOUT terraform:  + metadata = (known after apply) 2025-09-27 00:01:43.068225 | orchestrator | 00:01:43.068 STDOUT terraform:  + name = "testbed-volume-2-node-base" 2025-09-27 00:01:43.068266 | orchestrator | 00:01:43.068 STDOUT terraform:  + region = (known after apply) 2025-09-27 00:01:43.068283 | orchestrator | 00:01:43.068 STDOUT terraform:  + size = 80 2025-09-27 00:01:43.068318 | orchestrator | 00:01:43.068 STDOUT terraform:  + 
volume_retype_policy = "never" 2025-09-27 00:01:43.068334 | orchestrator | 00:01:43.068 STDOUT terraform:  + volume_type = "ssd" 2025-09-27 00:01:43.068345 | orchestrator | 00:01:43.068 STDOUT terraform:  } 2025-09-27 00:01:43.068398 | orchestrator | 00:01:43.068 STDOUT terraform:  # openstack_blockstorage_volume_v3.node_base_volume[3] will be created 2025-09-27 00:01:43.068448 | orchestrator | 00:01:43.068 STDOUT terraform:  + resource "openstack_blockstorage_volume_v3" "node_base_volume" { 2025-09-27 00:01:43.068489 | orchestrator | 00:01:43.068 STDOUT terraform:  + attachment = (known after apply) 2025-09-27 00:01:43.068506 | orchestrator | 00:01:43.068 STDOUT terraform:  + availability_zone = "nova" 2025-09-27 00:01:43.068556 | orchestrator | 00:01:43.068 STDOUT terraform:  + id = (known after apply) 2025-09-27 00:01:43.068601 | orchestrator | 00:01:43.068 STDOUT terraform:  + image_id = (known after apply) 2025-09-27 00:01:43.068642 | orchestrator | 00:01:43.068 STDOUT terraform:  + metadata = (known after apply) 2025-09-27 00:01:43.068709 | orchestrator | 00:01:43.068 STDOUT terraform:  + name = "testbed-volume-3-node-base" 2025-09-27 00:01:43.068728 | orchestrator | 00:01:43.068 STDOUT terraform:  + region = (known after apply) 2025-09-27 00:01:43.068764 | orchestrator | 00:01:43.068 STDOUT terraform:  + size = 80 2025-09-27 00:01:43.068782 | orchestrator | 00:01:43.068 STDOUT terraform:  + volume_retype_policy = "never" 2025-09-27 00:01:43.068796 | orchestrator | 00:01:43.068 STDOUT terraform:  + volume_type = "ssd" 2025-09-27 00:01:43.068810 | orchestrator | 00:01:43.068 STDOUT terraform:  } 2025-09-27 00:01:43.068880 | orchestrator | 00:01:43.068 STDOUT terraform:  # openstack_blockstorage_volume_v3.node_base_volume[4] will be created 2025-09-27 00:01:43.068932 | orchestrator | 00:01:43.068 STDOUT terraform:  + resource "openstack_blockstorage_volume_v3" "node_base_volume" { 2025-09-27 00:01:43.068950 | orchestrator | 00:01:43.068 STDOUT terraform:  + attachment = (known after apply) 2025-09-27 00:01:43.068992 | orchestrator | 00:01:43.068 STDOUT terraform:  + availability_zone = "nova" 2025-09-27 00:01:43.069032 | orchestrator | 00:01:43.068 STDOUT terraform:  + id = (known after apply) 2025-09-27 00:01:43.069105 | orchestrator | 00:01:43.069 STDOUT terraform:  + image_id = (known after apply) 2025-09-27 00:01:43.069120 | orchestrator | 00:01:43.069 STDOUT terraform:  + metadata = (known after apply) 2025-09-27 00:01:43.069186 | orchestrator | 00:01:43.069 STDOUT terraform:  + name = "testbed-volume-4-node-base" 2025-09-27 00:01:43.069238 | orchestrator | 00:01:43.069 STDOUT terraform:  + region = (known after apply) 2025-09-27 00:01:43.069251 | orchestrator | 00:01:43.069 STDOUT terraform:  + size = 80 2025-09-27 00:01:43.069266 | orchestrator | 00:01:43.069 STDOUT terraform:  + volume_retype_policy = "never" 2025-09-27 00:01:43.069280 | orchestrator | 00:01:43.069 STDOUT terraform:  + volume_type = "ssd" 2025-09-27 00:01:43.069295 | orchestrator | 00:01:43.069 STDOUT terraform:  } 2025-09-27 00:01:43.069354 | orchestrator | 00:01:43.069 STDOUT terraform:  # openstack_blockstorage_volume_v3.node_base_volume[5] will be created 2025-09-27 00:01:43.069405 | orchestrator | 00:01:43.069 STDOUT terraform:  + resource "openstack_blockstorage_volume_v3" "node_base_volume" { 2025-09-27 00:01:43.069421 | orchestrator | 00:01:43.069 STDOUT terraform:  + attachment = (known after apply) 2025-09-27 00:01:43.069460 | orchestrator | 00:01:43.069 STDOUT terraform:  + availability_zone = "nova" 
2025-09-27 00:01:43.069521 | orchestrator | 00:01:43.069 STDOUT terraform:  + id = (known after apply) 2025-09-27 00:01:43.069571 | orchestrator | 00:01:43.069 STDOUT terraform:  + image_id = (known after apply) 2025-09-27 00:01:43.069587 | orchestrator | 00:01:43.069 STDOUT terraform:  + metadata = (known after apply) 2025-09-27 00:01:43.069644 | orchestrator | 00:01:43.069 STDOUT terraform:  + name = "testbed-volume-5-node-base" 2025-09-27 00:01:43.069662 | orchestrator | 00:01:43.069 STDOUT terraform:  + region = (known after apply) 2025-09-27 00:01:43.069711 | orchestrator | 00:01:43.069 STDOUT terraform:  + size = 80 2025-09-27 00:01:43.069733 | orchestrator | 00:01:43.069 STDOUT terraform:  + volume_retype_policy = "never" 2025-09-27 00:01:43.069747 | orchestrator | 00:01:43.069 STDOUT terraform:  + volume_type = "ssd" 2025-09-27 00:01:43.069758 | orchestrator | 00:01:43.069 STDOUT terraform:  } 2025-09-27 00:01:43.069798 | orchestrator | 00:01:43.069 STDOUT terraform:  # openstack_blockstorage_volume_v3.node_volume[0] will be created 2025-09-27 00:01:43.069851 | orchestrator | 00:01:43.069 STDOUT terraform:  + resource "openstack_blockstorage_volume_v3" "node_volume" { 2025-09-27 00:01:43.069874 | orchestrator | 00:01:43.069 STDOUT terraform:  + attachment = (known after apply) 2025-09-27 00:01:43.069924 | orchestrator | 00:01:43.069 STDOUT terraform:  + availability_zone = "nova" 2025-09-27 00:01:43.069940 | orchestrator | 00:01:43.069 STDOUT terraform:  + id = (known after apply) 2025-09-27 00:01:43.069979 | orchestrator | 00:01:43.069 STDOUT terraform:  + metadata = (known after apply) 2025-09-27 00:01:43.070045 | orchestrator | 00:01:43.069 STDOUT terraform:  + name = "testbed-volume-0-node-3" 2025-09-27 00:01:43.070064 | orchestrator | 00:01:43.069 STDOUT terraform:  + region = (known after apply) 2025-09-27 00:01:43.070079 | orchestrator | 00:01:43.070 STDOUT terraform:  + size = 20 2025-09-27 00:01:43.070093 | orchestrator | 00:01:43.070 STDOUT terraform:  + volume_retype_policy = "never" 2025-09-27 00:01:43.070107 | orchestrator | 00:01:43.070 STDOUT terraform:  + volume_type = "ssd" 2025-09-27 00:01:43.070121 | orchestrator | 00:01:43.070 STDOUT terraform:  } 2025-09-27 00:01:43.070177 | orchestrator | 00:01:43.070 STDOUT terraform:  # openstack_blockstorage_volume_v3.node_volume[1] will be created 2025-09-27 00:01:43.070293 | orchestrator | 00:01:43.070 STDOUT terraform:  + resource "openstack_blockstorage_volume_v3" "node_volume" { 2025-09-27 00:01:43.070312 | orchestrator | 00:01:43.070 STDOUT terraform:  + attachment = (known after apply) 2025-09-27 00:01:43.070319 | orchestrator | 00:01:43.070 STDOUT terraform:  + availability_zone = "nova" 2025-09-27 00:01:43.070327 | orchestrator | 00:01:43.070 STDOUT terraform:  + id = (known after apply) 2025-09-27 00:01:43.070348 | orchestrator | 00:01:43.070 STDOUT terraform:  + metadata = (known after apply) 2025-09-27 00:01:43.070388 | orchestrator | 00:01:43.070 STDOUT terraform:  + name = "testbed-volume-1-node-4" 2025-09-27 00:01:43.070453 | orchestrator | 00:01:43.070 STDOUT terraform:  + region = (known after apply) 2025-09-27 00:01:43.070460 | orchestrator | 00:01:43.070 STDOUT terraform:  + size = 20 2025-09-27 00:01:43.070466 | orchestrator | 00:01:43.070 STDOUT terraform:  + volume_retype_policy = "never" 2025-09-27 00:01:43.070491 | orchestrator | 00:01:43.070 STDOUT terraform:  + volume_type = "ssd" 2025-09-27 00:01:43.070497 | orchestrator | 00:01:43.070 STDOUT terraform:  } 2025-09-27 00:01:43.070548 | orchestrator 
| 00:01:43.070 STDOUT terraform:  # openstack_blockstorage_volume_v3.node_volume[2] will be created 2025-09-27 00:01:43.070593 | orchestrator | 00:01:43.070 STDOUT terraform:  + resource "openstack_blockstorage_volume_v3" "node_volume" { 2025-09-27 00:01:43.070629 | orchestrator | 00:01:43.070 STDOUT terraform:  + attachment = (known after apply) 2025-09-27 00:01:43.070654 | orchestrator | 00:01:43.070 STDOUT terraform:  + availability_zone = "nova" 2025-09-27 00:01:43.070693 | orchestrator | 00:01:43.070 STDOUT terraform:  + id = (known after apply) 2025-09-27 00:01:43.070730 | orchestrator | 00:01:43.070 STDOUT terraform:  + metadata = (known after apply) 2025-09-27 00:01:43.070770 | orchestrator | 00:01:43.070 STDOUT terraform:  + name = "testbed-volume-2-node-5" 2025-09-27 00:01:43.070808 | orchestrator | 00:01:43.070 STDOUT terraform:  + region = (known after apply) 2025-09-27 00:01:43.070830 | orchestrator | 00:01:43.070 STDOUT terraform:  + size = 20 2025-09-27 00:01:43.070861 | orchestrator | 00:01:43.070 STDOUT terraform:  + volume_retype_policy = "never" 2025-09-27 00:01:43.070879 | orchestrator | 00:01:43.070 STDOUT terraform:  + volume_type = "ssd" 2025-09-27 00:01:43.070885 | orchestrator | 00:01:43.070 STDOUT terraform:  } 2025-09-27 00:01:43.070937 | orchestrator | 00:01:43.070 STDOUT terraform:  # openstack_blockstorage_volume_v3.node_volume[3] will be created 2025-09-27 00:01:43.070981 | orchestrator | 00:01:43.070 STDOUT terraform:  + resource "openstack_blockstorage_volume_v3" "node_volume" { 2025-09-27 00:01:43.071019 | orchestrator | 00:01:43.070 STDOUT terraform:  + attachment = (known after apply) 2025-09-27 00:01:43.071043 | orchestrator | 00:01:43.071 STDOUT terraform:  + availability_zone = "nova" 2025-09-27 00:01:43.071082 | orchestrator | 00:01:43.071 STDOUT terraform:  + id = (known after apply) 2025-09-27 00:01:43.071119 | orchestrator | 00:01:43.071 STDOUT terraform:  + metadata = (known after apply) 2025-09-27 00:01:43.071182 | orchestrator | 00:01:43.071 STDOUT terraform:  + name = "testbed-volume-3-node-3" 2025-09-27 00:01:43.071219 | orchestrator | 00:01:43.071 STDOUT terraform:  + region = (known after apply) 2025-09-27 00:01:43.071243 | orchestrator | 00:01:43.071 STDOUT terraform:  + size = 20 2025-09-27 00:01:43.071267 | orchestrator | 00:01:43.071 STDOUT terraform:  + volume_retype_policy = "never" 2025-09-27 00:01:43.071290 | orchestrator | 00:01:43.071 STDOUT terraform:  + volume_type = "ssd" 2025-09-27 00:01:43.071297 | orchestrator | 00:01:43.071 STDOUT terraform:  } 2025-09-27 00:01:43.071341 | orchestrator | 00:01:43.071 STDOUT terraform:  # openstack_blockstorage_volume_v3.node_volume[4] will be created 2025-09-27 00:01:43.071385 | orchestrator | 00:01:43.071 STDOUT terraform:  + resource "openstack_blockstorage_volume_v3" "node_volume" { 2025-09-27 00:01:43.071422 | orchestrator | 00:01:43.071 STDOUT terraform:  + attachment = (known after apply) 2025-09-27 00:01:43.071446 | orchestrator | 00:01:43.071 STDOUT terraform:  + availability_zone = "nova" 2025-09-27 00:01:43.071481 | orchestrator | 00:01:43.071 STDOUT terraform:  + id = (known after apply) 2025-09-27 00:01:43.071515 | orchestrator | 00:01:43.071 STDOUT terraform:  + metadata = (known after apply) 2025-09-27 00:01:43.071553 | orchestrator | 00:01:43.071 STDOUT terraform:  + name = "testbed-volume-4-node-4" 2025-09-27 00:01:43.071586 | orchestrator | 00:01:43.071 STDOUT terraform:  + region = (known after apply) 2025-09-27 00:01:43.071609 | orchestrator | 00:01:43.071 STDOUT 
terraform:  + size = 20 2025-09-27 00:01:43.071630 | orchestrator | 00:01:43.071 STDOUT terraform:  + volume_retype_policy = "never" 2025-09-27 00:01:43.071653 | orchestrator | 00:01:43.071 STDOUT terraform:  + volume_type = "ssd" 2025-09-27 00:01:43.071660 | orchestrator | 00:01:43.071 STDOUT terraform:  } 2025-09-27 00:01:43.071706 | orchestrator | 00:01:43.071 STDOUT terraform:  # openstack_blockstorage_volume_v3.node_volume[5] will be created 2025-09-27 00:01:43.071748 | orchestrator | 00:01:43.071 STDOUT terraform:  + resource "openstack_blockstorage_volume_v3" "node_volume" { 2025-09-27 00:01:43.071781 | orchestrator | 00:01:43.071 STDOUT terraform:  + attachment = (known after apply) 2025-09-27 00:01:43.071804 | orchestrator | 00:01:43.071 STDOUT terraform:  + availability_zone = "nova" 2025-09-27 00:01:43.071840 | orchestrator | 00:01:43.071 STDOUT terraform:  + id = (known after apply) 2025-09-27 00:01:43.071873 | orchestrator | 00:01:43.071 STDOUT terraform:  + metadata = (known after apply) 2025-09-27 00:01:43.071909 | orchestrator | 00:01:43.071 STDOUT terraform:  + name = "testbed-volume-5-node-5" 2025-09-27 00:01:43.071943 | orchestrator | 00:01:43.071 STDOUT terraform:  + region = (known after apply) 2025-09-27 00:01:43.071967 | orchestrator | 00:01:43.071 STDOUT terraform:  + size = 20 2025-09-27 00:01:43.071987 | orchestrator | 00:01:43.071 STDOUT terraform:  + volume_retype_policy = "never" 2025-09-27 00:01:43.072010 | orchestrator | 00:01:43.071 STDOUT terraform:  + volume_type = "ssd" 2025-09-27 00:01:43.072017 | orchestrator | 00:01:43.072 STDOUT terraform:  } 2025-09-27 00:01:43.072062 | orchestrator | 00:01:43.072 STDOUT terraform:  # openstack_blockstorage_volume_v3.node_volume[6] will be created 2025-09-27 00:01:43.072102 | orchestrator | 00:01:43.072 STDOUT terraform:  + resource "openstack_blockstorage_volume_v3" "node_volume" { 2025-09-27 00:01:43.072144 | orchestrator | 00:01:43.072 STDOUT terraform:  + attachment = (known after apply) 2025-09-27 00:01:43.072164 | orchestrator | 00:01:43.072 STDOUT terraform:  + availability_zone = "nova" 2025-09-27 00:01:43.072198 | orchestrator | 00:01:43.072 STDOUT terraform:  + id = (known after apply) 2025-09-27 00:01:43.072233 | orchestrator | 00:01:43.072 STDOUT terraform:  + metadata = (known after apply) 2025-09-27 00:01:43.072281 | orchestrator | 00:01:43.072 STDOUT terraform:  + name = "testbed-volume-6-node-3" 2025-09-27 00:01:43.072315 | orchestrator | 00:01:43.072 STDOUT terraform:  + region = (known after apply) 2025-09-27 00:01:43.072336 | orchestrator | 00:01:43.072 STDOUT terraform:  + size = 20 2025-09-27 00:01:43.072358 | orchestrator | 00:01:43.072 STDOUT terraform:  + volume_retype_policy = "never" 2025-09-27 00:01:43.072382 | orchestrator | 00:01:43.072 STDOUT terraform:  + volume_type = "ssd" 2025-09-27 00:01:43.072389 | orchestrator | 00:01:43.072 STDOUT terraform:  } 2025-09-27 00:01:43.072433 | orchestrator | 00:01:43.072 STDOUT terraform:  # openstack_blockstorage_volume_v3.node_volume[7] will be created 2025-09-27 00:01:43.072474 | orchestrator | 00:01:43.072 STDOUT terraform:  + resource "openstack_blockstorage_volume_v3" "node_volume" { 2025-09-27 00:01:43.072508 | orchestrator | 00:01:43.072 STDOUT terraform:  + attachment = (known after apply) 2025-09-27 00:01:43.072530 | orchestrator | 00:01:43.072 STDOUT terraform:  + availability_zone = "nova" 2025-09-27 00:01:43.072566 | orchestrator | 00:01:43.072 STDOUT terraform:  + id = (known after apply) 2025-09-27 00:01:43.072601 | orchestrator | 
00:01:43.072 STDOUT terraform:  + metadata = (known after apply) 2025-09-27 00:01:43.072637 | orchestrator | 00:01:43.072 STDOUT terraform:  + name = "testbed-volume-7-node-4" 2025-09-27 00:01:43.072671 | orchestrator | 00:01:43.072 STDOUT terraform:  + region = (known after apply) 2025-09-27 00:01:43.072691 | orchestrator | 00:01:43.072 STDOUT terraform:  + size = 20 2025-09-27 00:01:43.072715 | orchestrator | 00:01:43.072 STDOUT terraform:  + volume_retype_policy = "never" 2025-09-27 00:01:43.072739 | orchestrator | 00:01:43.072 STDOUT terraform:  + volume_type = "ssd" 2025-09-27 00:01:43.072746 | orchestrator | 00:01:43.072 STDOUT terraform:  } 2025-09-27 00:01:43.072790 | orchestrator | 00:01:43.072 STDOUT terraform:  # openstack_blockstorage_volume_v3.node_volume[8] will be created 2025-09-27 00:01:43.072831 | orchestrator | 00:01:43.072 STDOUT terraform:  + resource "openstack_blockstorage_volume_v3" "node_volume" { 2025-09-27 00:01:43.072865 | orchestrator | 00:01:43.072 STDOUT terraform:  + attachment = (known after apply) 2025-09-27 00:01:43.072890 | orchestrator | 00:01:43.072 STDOUT terraform:  + availability_zone = "nova" 2025-09-27 00:01:43.072925 | orchestrator | 00:01:43.072 STDOUT terraform:  + id = (known after apply) 2025-09-27 00:01:43.072961 | orchestrator | 00:01:43.072 STDOUT terraform:  + metadata = (known after apply) 2025-09-27 00:01:43.072998 | orchestrator | 00:01:43.072 STDOUT terraform:  + name = "testbed-volume-8-node-5" 2025-09-27 00:01:43.073032 | orchestrator | 00:01:43.072 STDOUT terraform:  + region = (known after apply) 2025-09-27 00:01:43.073084 | orchestrator | 00:01:43.073 STDOUT terraform:  + size = 20 2025-09-27 00:01:43.073110 | orchestrator | 00:01:43.073 STDOUT terraform:  + volume_retype_policy = "never" 2025-09-27 00:01:43.073116 | orchestrator | 00:01:43.073 STDOUT terraform:  + volume_type = "ssd" 2025-09-27 00:01:43.073148 | orchestrator | 00:01:43.073 STDOUT terraform:  } 2025-09-27 00:01:43.073196 | orchestrator | 00:01:43.073 STDOUT terraform:  # openstack_compute_instance_v2.manager_server will be created 2025-09-27 00:01:43.073235 | orchestrator | 00:01:43.073 STDOUT terraform:  + resource "openstack_compute_instance_v2" "manager_server" { 2025-09-27 00:01:43.073274 | orchestrator | 00:01:43.073 STDOUT terraform:  + access_ip_v4 = (known after apply) 2025-09-27 00:01:43.073308 | orchestrator | 00:01:43.073 STDOUT terraform:  + access_ip_v6 = (known after apply) 2025-09-27 00:01:43.073365 | orchestrator | 00:01:43.073 STDOUT terraform:  + all_metadata = (known after apply) 2025-09-27 00:01:43.073399 | orchestrator | 00:01:43.073 STDOUT terraform:  + all_tags = (known after apply) 2025-09-27 00:01:43.073424 | orchestrator | 00:01:43.073 STDOUT terraform:  + availability_zone = "nova" 2025-09-27 00:01:43.073436 | orchestrator | 00:01:43.073 STDOUT terraform:  + config_drive = true 2025-09-27 00:01:43.073476 | orchestrator | 00:01:43.073 STDOUT terraform:  + created = (known after apply) 2025-09-27 00:01:43.073510 | orchestrator | 00:01:43.073 STDOUT terraform:  + flavor_id = (known after apply) 2025-09-27 00:01:43.073539 | orchestrator | 00:01:43.073 STDOUT terraform:  + flavor_name = "OSISM-4V-16" 2025-09-27 00:01:43.073562 | orchestrator | 00:01:43.073 STDOUT terraform:  + force_delete = false 2025-09-27 00:01:43.073595 | orchestrator | 00:01:43.073 STDOUT terraform:  + hypervisor_hostname = (known after apply) 2025-09-27 00:01:43.073630 | orchestrator | 00:01:43.073 STDOUT terraform:  + id = (known after apply) 2025-09-27 
00:01:43.073666 | orchestrator | 00:01:43.073 STDOUT terraform:  + image_id = (known after apply) 2025-09-27 00:01:43.073700 | orchestrator | 00:01:43.073 STDOUT terraform:  + image_name = (known after apply) 2025-09-27 00:01:43.073725 | orchestrator | 00:01:43.073 STDOUT terraform:  + key_pair = "testbed" 2025-09-27 00:01:43.073757 | orchestrator | 00:01:43.073 STDOUT terraform:  + name = "testbed-manager" 2025-09-27 00:01:43.073781 | orchestrator | 00:01:43.073 STDOUT terraform:  + power_state = "active" 2025-09-27 00:01:43.073816 | orchestrator | 00:01:43.073 STDOUT terraform:  + region = (known after apply) 2025-09-27 00:01:43.073849 | orchestrator | 00:01:43.073 STDOUT terraform:  + security_groups = (known after apply) 2025-09-27 00:01:43.073872 | orchestrator | 00:01:43.073 STDOUT terraform:  + stop_before_destroy = false 2025-09-27 00:01:43.073906 | orchestrator | 00:01:43.073 STDOUT terraform:  + updated = (known after apply) 2025-09-27 00:01:43.073936 | orchestrator | 00:01:43.073 STDOUT terraform:  + user_data = (sensitive value) 2025-09-27 00:01:43.073942 | orchestrator | 00:01:43.073 STDOUT terraform:  + block_device { 2025-09-27 00:01:43.073972 | orchestrator | 00:01:43.073 STDOUT terraform:  + boot_index = 0 2025-09-27 00:01:43.074000 | orchestrator | 00:01:43.073 STDOUT terraform:  + delete_on_termination = false 2025-09-27 00:01:43.074044 | orchestrator | 00:01:43.073 STDOUT terraform:  + destination_type = "volume" 2025-09-27 00:01:43.074071 | orchestrator | 00:01:43.074 STDOUT terraform:  + multiattach = false 2025-09-27 00:01:43.074100 | orchestrator | 00:01:43.074 STDOUT terraform:  + source_type = "volume" 2025-09-27 00:01:43.074149 | orchestrator | 00:01:43.074 STDOUT terraform:  + uuid = (known after apply) 2025-09-27 00:01:43.074156 | orchestrator | 00:01:43.074 STDOUT terraform:  } 2025-09-27 00:01:43.074162 | orchestrator | 00:01:43.074 STDOUT terraform:  + network { 2025-09-27 00:01:43.074283 | orchestrator | 00:01:43.074 STDOUT terraform:  + access_network = false 2025-09-27 00:01:43.074322 | orchestrator | 00:01:43.074 STDOUT terraform:  + fixed_ip_v4 = (known after apply) 2025-09-27 00:01:43.074336 | orchestrator | 00:01:43.074 STDOUT terraform:  + fixed_ip_v6 = (known after apply) 2025-09-27 00:01:43.074354 | orchestrator | 00:01:43.074 STDOUT terraform:  + mac = (known after apply) 2025-09-27 00:01:43.074377 | orchestrator | 00:01:43.074 STDOUT terraform:  + name = (known after apply) 2025-09-27 00:01:43.074387 | orchestrator | 00:01:43.074 STDOUT terraform:  + port = (known after apply) 2025-09-27 00:01:43.074397 | orchestrator | 00:01:43.074 STDOUT terraform:  + uuid = (known after apply) 2025-09-27 00:01:43.074406 | orchestrator | 00:01:43.074 STDOUT terraform:  } 2025-09-27 00:01:43.074420 | orchestrator | 00:01:43.074 STDOUT terraform:  } 2025-09-27 00:01:43.074431 | orchestrator | 00:01:43.074 STDOUT terraform:  # openstack_compute_instance_v2.node_server[0] will be created 2025-09-27 00:01:43.074444 | orchestrator | 00:01:43.074 STDOUT terraform:  + resource "openstack_compute_instance_v2" "node_server" { 2025-09-27 00:01:43.074484 | orchestrator | 00:01:43.074 STDOUT terraform:  + access_ip_v4 = (known after apply) 2025-09-27 00:01:43.074506 | orchestrator | 00:01:43.074 STDOUT terraform:  + access_ip_v6 = (known after apply) 2025-09-27 00:01:43.074542 | orchestrator | 00:01:43.074 STDOUT terraform:  + all_metadata = (known after apply) 2025-09-27 00:01:43.074557 | orchestrator | 00:01:43.074 STDOUT terraform:  + all_tags = (known after apply) 
      + availability_zone   = "nova"
      + config_drive        = true
      + created             = (known after apply)
      + flavor_id           = (known after apply)
      + flavor_name         = "OSISM-8V-32"
      + force_delete        = false
      + hypervisor_hostname = (known after apply)
      + id                  = (known after apply)
      + image_id            = (known after apply)
      + image_name          = (known after apply)
      + key_pair            = "testbed"
      + name                = "testbed-node-0"
      + power_state         = "active"
      + region              = (known after apply)
      + security_groups     = (known after apply)
      + stop_before_destroy = false
      + updated             = (known after apply)
      + user_data           = "ae09e46b224a6ca206a9ed4f8f8a4f8520827854"

      + block_device {
          + boot_index            = 0
          + delete_on_termination = false
          + destination_type      = "volume"
          + multiattach           = false
          + source_type           = "volume"
          + uuid                  = (known after apply)
        }

      + network {
          + access_network = false
          + fixed_ip_v4    = (known after apply)
          + fixed_ip_v6    = (known after apply)
          + mac            = (known after apply)
          + name           = (known after apply)
          + port           = (known after apply)
          + uuid           = (known after apply)
        }
    }

  # openstack_compute_instance_v2.node_server[1] through node_server[5] are planned
  # with identical attributes, differing only in name = "testbed-node-1" … "testbed-node-5".
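The six node plans above come from one counted resource. As a rough orientation, a minimal HCL sketch that could produce a plan of this shape might look as follows; it is not the actual configuration from osism/testbed, and the node_count variable, the user_data source and the volume/port references are assumptions for illustration:

    variable "node_count" {
      default = 6                                 # assumed; six nodes in this run
    }

    resource "openstack_compute_instance_v2" "node_server" {
      count             = var.node_count
      name              = "testbed-node-${count.index}"
      availability_zone = "nova"
      flavor_name       = "OSISM-8V-32"
      key_pair          = openstack_compute_keypair_v2.key.name
      config_drive      = true
      power_state       = "active"
      user_data         = file("user_data.yml")   # assumed source; the plan only shows its hash

      # Boot from a pre-created volume (the volume resource name is an assumption).
      block_device {
        uuid                  = openstack_blockstorage_volume_v3.node_base_volume[count.index].id
        source_type           = "volume"
        destination_type      = "volume"
        boot_index            = 0
        delete_on_termination = false
      }

      # Attach the management port that is planned further down in this log.
      network {
        port = openstack_networking_port_v2.node_port_management[count.index].id
      }
    }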
  # openstack_compute_keypair_v2.key will be created
  + resource "openstack_compute_keypair_v2" "key" {
      + fingerprint = (known after apply)
      + id          = (known after apply)
      + name        = "testbed"
      + private_key = (sensitive value)
      + public_key  = (known after apply)
      + region      = (known after apply)
      + user_id     = (known after apply)
    }

  # openstack_compute_volume_attach_v2.node_volume_attachment[0] will be created
  + resource "openstack_compute_volume_attach_v2" "node_volume_attachment" {
      + device      = (known after apply)
      + id          = (known after apply)
      + instance_id = (known after apply)
      + region      = (known after apply)
      + volume_id   = (known after apply)
    }

  # openstack_compute_volume_attach_v2.node_volume_attachment[1] through [8] are planned
  # with identical attributes (all values known after apply).
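The keypair and the nine volume attachments above follow from two short resources. The sketch below is an approximation: because it omits public_key, the provider generates the keypair (matching the sensitive private_key in the plan), while the list of extra volume IDs and the volume-to-node mapping are assumptions, since the volume resources themselves are not part of this excerpt:

    resource "openstack_compute_keypair_v2" "key" {
      name = "testbed"            # no public_key given, so OpenStack generates the pair
    }

    variable "extra_volume_ids" {
      type    = list(string)      # assumed: IDs of the additional volumes to attach
      default = []
    }

    resource "openstack_compute_volume_attach_v2" "node_volume_attachment" {
      count       = length(var.extra_volume_ids)                                   # nine in this plan
      instance_id = openstack_compute_instance_v2.node_server[count.index % 6].id  # assumed mapping
      volume_id   = var.extra_volume_ids[count.index]
    }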
"openstack_networking_floatingip_associate_v2" "manager_floating_ip_association" { 2025-09-27 00:01:43.083475 | orchestrator | 00:01:43.083 STDOUT terraform:  + fixed_ip = (known after apply) 2025-09-27 00:01:43.083522 | orchestrator | 00:01:43.083 STDOUT terraform:  + floating_ip = (known after apply) 2025-09-27 00:01:43.083538 | orchestrator | 00:01:43.083 STDOUT terraform:  + id = (known after apply) 2025-09-27 00:01:43.083588 | orchestrator | 00:01:43.083 STDOUT terraform:  + port_id = (known after apply) 2025-09-27 00:01:43.083601 | orchestrator | 00:01:43.083 STDOUT terraform:  + region = (known after apply) 2025-09-27 00:01:43.083615 | orchestrator | 00:01:43.083 STDOUT terraform:  } 2025-09-27 00:01:43.083653 | orchestrator | 00:01:43.083 STDOUT terraform:  # openstack_networking_floatingip_v2.manager_floating_ip will be created 2025-09-27 00:01:43.083692 | orchestrator | 00:01:43.083 STDOUT terraform:  + resource "openstack_networking_floatingip_v2" "manager_floating_ip" { 2025-09-27 00:01:43.083708 | orchestrator | 00:01:43.083 STDOUT terraform:  + address = (known after apply) 2025-09-27 00:01:43.083722 | orchestrator | 00:01:43.083 STDOUT terraform:  + all_tags = (known after apply) 2025-09-27 00:01:43.083749 | orchestrator | 00:01:43.083 STDOUT terraform:  + dns_domain = (known after apply) 2025-09-27 00:01:43.083765 | orchestrator | 00:01:43.083 STDOUT terraform:  + dns_name = (known after apply) 2025-09-27 00:01:43.083803 | orchestrator | 00:01:43.083 STDOUT terraform:  + fixed_ip = (known after apply) 2025-09-27 00:01:43.083819 | orchestrator | 00:01:43.083 STDOUT terraform:  + id = (known after apply) 2025-09-27 00:01:43.083833 | orchestrator | 00:01:43.083 STDOUT terraform:  + pool = "public" 2025-09-27 00:01:43.083848 | orchestrator | 00:01:43.083 STDOUT terraform:  + port_id = (known after apply) 2025-09-27 00:01:43.083862 | orchestrator | 00:01:43.083 STDOUT terraform:  + region = (known after apply) 2025-09-27 00:01:43.083899 | orchestrator | 00:01:43.083 STDOUT terraform:  + subnet_id = (known after apply) 2025-09-27 00:01:43.083924 | orchestrator | 00:01:43.083 STDOUT terraform:  + tenant_id = (known after apply) 2025-09-27 00:01:43.083935 | orchestrator | 00:01:43.083 STDOUT terraform:  } 2025-09-27 00:01:43.083984 | orchestrator | 00:01:43.083 STDOUT terraform:  # openstack_networking_network_v2.net_management will be created 2025-09-27 00:01:43.084001 | orchestrator | 00:01:43.083 STDOUT terraform:  + resource "openstack_networking_network_v2" "net_management" { 2025-09-27 00:01:43.084039 | orchestrator | 00:01:43.083 STDOUT terraform:  + admin_state_up = (known after apply) 2025-09-27 00:01:43.084089 | orchestrator | 00:01:43.084 STDOUT terraform:  + all_tags = (known after apply) 2025-09-27 00:01:43.084102 | orchestrator | 00:01:43.084 STDOUT terraform:  + availability_zone_hints = [ 2025-09-27 00:01:43.084117 | orchestrator | 00:01:43.084 STDOUT terraform:  + "nova", 2025-09-27 00:01:43.084150 | orchestrator | 00:01:43.084 STDOUT terraform:  ] 2025-09-27 00:01:43.084166 | orchestrator | 00:01:43.084 STDOUT terraform:  + dns_domain = (known after apply) 2025-09-27 00:01:43.084180 | orchestrator | 00:01:43.084 STDOUT terraform:  + external = (known after apply) 2025-09-27 00:01:43.084220 | orchestrator | 00:01:43.084 STDOUT terraform:  + id = (known after apply) 2025-09-27 00:01:43.084237 | orchestrator | 00:01:43.084 STDOUT terraform:  + mtu = (known after apply) 2025-09-27 00:01:43.084291 | orchestrator | 00:01:43.084 STDOUT terraform:  + name = 
"net-testbed-management" 2025-09-27 00:01:43.084308 | orchestrator | 00:01:43.084 STDOUT terraform:  + port_security_enabled = (known after apply) 2025-09-27 00:01:43.084356 | orchestrator | 00:01:43.084 STDOUT terraform:  + qos_policy_id = (known after apply) 2025-09-27 00:01:43.084373 | orchestrator | 00:01:43.084 STDOUT terraform:  + region = (known after apply) 2025-09-27 00:01:43.084430 | orchestrator | 00:01:43.084 STDOUT terraform:  + shared = (known after apply) 2025-09-27 00:01:43.084448 | orchestrator | 00:01:43.084 STDOUT terraform:  + tenant_id = (known after apply) 2025-09-27 00:01:43.084487 | orchestrator | 00:01:43.084 STDOUT terraform:  + transparent_vlan = (known after apply) 2025-09-27 00:01:43.084504 | orchestrator | 00:01:43.084 STDOUT terraform:  + segments (known after apply) 2025-09-27 00:01:43.084515 | orchestrator | 00:01:43.084 STDOUT terraform:  } 2025-09-27 00:01:43.084554 | orchestrator | 00:01:43.084 STDOUT terraform:  # openstack_networking_port_v2.manager_port_management will be created 2025-09-27 00:01:43.084662 | orchestrator | 00:01:43.084 STDOUT terraform:  + resource "openstack_networking_port_v2" "manager_port_management" { 2025-09-27 00:01:43.084715 | orchestrator | 00:01:43.084 STDOUT terraform:  + admin_state_up = (known after apply) 2025-09-27 00:01:43.084731 | orchestrator | 00:01:43.084 STDOUT terraform:  + all_fixed_ips = (known after apply) 2025-09-27 00:01:43.084754 | orchestrator | 00:01:43.084 STDOUT terraform:  + all_security_group_ids = (known after apply) 2025-09-27 00:01:43.084803 | orchestrator | 00:01:43.084 STDOUT terraform:  + all_tags = (known after apply) 2025-09-27 00:01:43.084820 | orchestrator | 00:01:43.084 STDOUT terraform:  + device_id = (known after apply) 2025-09-27 00:01:43.084860 | orchestrator | 00:01:43.084 STDOUT terraform:  + device_owner = (known after apply) 2025-09-27 00:01:43.084902 | orchestrator | 00:01:43.084 STDOUT terraform:  + dns_assignment = (known after apply) 2025-09-27 00:01:43.084918 | orchestrator | 00:01:43.084 STDOUT terraform:  + dns_name = (known after apply) 2025-09-27 00:01:43.084970 | orchestrator | 00:01:43.084 STDOUT terraform:  + id = (known after apply) 2025-09-27 00:01:43.084987 | orchestrator | 00:01:43.084 STDOUT terraform:  + mac_address = (known after apply) 2025-09-27 00:01:43.085038 | orchestrator | 00:01:43.084 STDOUT terraform:  + network_id = (known after apply) 2025-09-27 00:01:43.085054 | orchestrator | 00:01:43.085 STDOUT terraform:  + port_security_enabled = (known after apply) 2025-09-27 00:01:43.085104 | orchestrator | 00:01:43.085 STDOUT terraform:  + qos_policy_id = (known after apply) 2025-09-27 00:01:43.085120 | orchestrator | 00:01:43.085 STDOUT terraform:  + region = (known after apply) 2025-09-27 00:01:43.085174 | orchestrator | 00:01:43.085 STDOUT terraform:  + security_group_ids = (known after apply) 2025-09-27 00:01:43.085191 | orchestrator | 00:01:43.085 STDOUT terraform:  + tenant_id = (known after apply) 2025-09-27 00:01:43.085205 | orchestrator | 00:01:43.085 STDOUT terraform:  + allowed_address_pairs { 2025-09-27 00:01:43.085245 | orchestrator | 00:01:43.085 STDOUT terraform:  + ip_address = "192.168.112.0/20" 2025-09-27 00:01:43.085257 | orchestrator | 00:01:43.085 STDOUT terraform:  } 2025-09-27 00:01:43.085271 | orchestrator | 00:01:43.085 STDOUT terraform:  + allowed_address_pairs { 2025-09-27 00:01:43.085286 | orchestrator | 00:01:43.085 STDOUT terraform:  + ip_address = "192.168.16.8/20" 2025-09-27 00:01:43.085297 | orchestrator | 00:01:43.085 STDOUT 
  # openstack_networking_port_v2.manager_port_management will be created
  + resource "openstack_networking_port_v2" "manager_port_management" {
      + admin_state_up         = (known after apply)
      + all_fixed_ips          = (known after apply)
      + all_security_group_ids = (known after apply)
      + all_tags               = (known after apply)
      + device_id              = (known after apply)
      + device_owner           = (known after apply)
      + dns_assignment         = (known after apply)
      + dns_name               = (known after apply)
      + id                     = (known after apply)
      + mac_address            = (known after apply)
      + network_id             = (known after apply)
      + port_security_enabled  = (known after apply)
      + qos_policy_id          = (known after apply)
      + region                 = (known after apply)
      + security_group_ids     = (known after apply)
      + tenant_id              = (known after apply)

      + allowed_address_pairs {
          + ip_address = "192.168.112.0/20"
        }
      + allowed_address_pairs {
          + ip_address = "192.168.16.8/20"
        }

      + binding (known after apply)

      + fixed_ip {
          + ip_address = "192.168.16.5"
          + subnet_id  = (known after apply)
        }
    }

  # openstack_networking_port_v2.node_port_management[0] will be created
  + resource "openstack_networking_port_v2" "node_port_management" {
      + admin_state_up         = (known after apply)
      + all_fixed_ips          = (known after apply)
      + all_security_group_ids = (known after apply)
      + all_tags               = (known after apply)
      + device_id              = (known after apply)
      + device_owner           = (known after apply)
      + dns_assignment         = (known after apply)
      + dns_name               = (known after apply)
      + id                     = (known after apply)
      + mac_address            = (known after apply)
      + network_id             = (known after apply)
      + port_security_enabled  = (known after apply)
      + qos_policy_id          = (known after apply)
      + region                 = (known after apply)
      + security_group_ids     = (known after apply)
      + tenant_id              = (known after apply)

      + allowed_address_pairs {
          + ip_address = "192.168.112.0/20"
        }
      + allowed_address_pairs {
          + ip_address = "192.168.16.254/20"
        }
      + allowed_address_pairs {
          + ip_address = "192.168.16.8/20"
        }
      + allowed_address_pairs {
          + ip_address = "192.168.16.9/20"
        }

      + binding (known after apply)

      + fixed_ip {
          + ip_address = "192.168.16.10"
          + subnet_id  = (known after apply)
        }
    }

  # openstack_networking_port_v2.node_port_management[1] through [5] are planned with
  # identical attributes, differing only in fixed_ip.ip_address = "192.168.16.11" … "192.168.16.15".
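The manager port and the six node ports above share the same structure; a sketch using the network and the assumed subnet from the previous example, with count for the node ports and the allowed-address-pairs written once via a dynamic block:

    resource "openstack_networking_port_v2" "manager_port_management" {
      network_id = openstack_networking_network_v2.net_management.id

      fixed_ip {
        ip_address = "192.168.16.5"
        subnet_id  = openstack_networking_subnet_v2.subnet_management.id   # assumed subnet resource
      }

      allowed_address_pairs {
        ip_address = "192.168.112.0/20"
      }
      allowed_address_pairs {
        ip_address = "192.168.16.8/20"
      }
    }

    resource "openstack_networking_port_v2" "node_port_management" {
      count      = var.node_count
      network_id = openstack_networking_network_v2.net_management.id

      fixed_ip {
        ip_address = "192.168.16.${10 + count.index}"                      # 192.168.16.10 … .15
        subnet_id  = openstack_networking_subnet_v2.subnet_management.id   # assumed subnet resource
      }

      dynamic "allowed_address_pairs" {
        for_each = ["192.168.112.0/20", "192.168.16.254/20", "192.168.16.8/20", "192.168.16.9/20"]
        content {
          ip_address = allowed_address_pairs.value
        }
      }
    }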
  # openstack_networking_router_interface_v2.router_interface will be created
  + resource "openstack_networking_router_interface_v2" "router_interface" {
      + force_destroy = false
      + id            = (known after apply)
      + port_id       = (known after apply)
      + region        = (known after apply)
      + router_id     = (known after apply)
      + subnet_id     = (known after apply)
    }

  # openstack_networking_router_v2.router will be created
  + resource "openstack_networking_router_v2" "router" {
      + admin_state_up          = (known after apply)
      + all_tags                = (known after apply)
      + availability_zone_hints = [
          + "nova",
        ]
      + distributed             = (known after apply)
      + enable_snat             = (known after apply)
      + external_network_id     = "e6be7364-bfd8-4de7-8120-8f41c69a139a"
      + external_qos_policy_id  = (known after apply)
      + id                      = (known after apply)
      + name                    = "testbed"
      + region                  = (known after apply)
      + tenant_id               = (known after apply)
      + external_fixed_ip         (known after apply)
    }
openstack_networking_secgroup_rule_v2.security_group_management_rule2 will be created 2025-09-27 00:01:43.092633 | orchestrator | 00:01:43.092 STDOUT terraform:  + resource "openstack_networking_secgroup_rule_v2" "security_group_management_rule2" { 2025-09-27 00:01:43.092650 | orchestrator | 00:01:43.092 STDOUT terraform:  + description = "wireguard" 2025-09-27 00:01:43.092664 | orchestrator | 00:01:43.092 STDOUT terraform:  + direction = "ingress" 2025-09-27 00:01:43.092688 | orchestrator | 00:01:43.092 STDOUT terraform:  + ethertype = "IPv4" 2025-09-27 00:01:43.092738 | orchestrator | 00:01:43.092 STDOUT terraform:  + id = (known after apply) 2025-09-27 00:01:43.092755 | orchestrator | 00:01:43.092 STDOUT terraform:  + port_range_max = 51820 2025-09-27 00:01:43.092769 | orchestrator | 00:01:43.092 STDOUT terraform:  + port_range_min = 51820 2025-09-27 00:01:43.092783 | orchestrator | 00:01:43.092 STDOUT terraform:  + protocol = "udp" 2025-09-27 00:01:43.092833 | orchestrator | 00:01:43.092 STDOUT terraform:  + region = (known after apply) 2025-09-27 00:01:43.092850 | orchestrator | 00:01:43.092 STDOUT terraform:  + remote_address_group_id = (known after apply) 2025-09-27 00:01:43.092899 | orchestrator | 00:01:43.092 STDOUT terraform:  + remote_group_id = (known after apply) 2025-09-27 00:01:43.092915 | orchestrator | 00:01:43.092 STDOUT terraform:  + remote_ip_prefix = "0.0.0.0/0" 2025-09-27 00:01:43.092960 | orchestrator | 00:01:43.092 STDOUT terraform:  + security_group_id = (known after apply) 2025-09-27 00:01:43.092977 | orchestrator | 00:01:43.092 STDOUT terraform:  + tenant_id = (known after apply) 2025-09-27 00:01:43.092991 | orchestrator | 00:01:43.092 STDOUT terraform:  } 2025-09-27 00:01:43.093047 | orchestrator | 00:01:43.092 STDOUT terraform:  # openstack_networking_secgroup_rule_v2.security_group_management_rule3 will be created 2025-09-27 00:01:43.093100 | orchestrator | 00:01:43.093 STDOUT terraform:  + resource "openstack_networking_secgroup_rule_v2" "security_group_management_rule3" { 2025-09-27 00:01:43.093117 | orchestrator | 00:01:43.093 STDOUT terraform:  + direction = "ingress" 2025-09-27 00:01:43.093163 | orchestrator | 00:01:43.093 STDOUT terraform:  + ethertype = "IPv4" 2025-09-27 00:01:43.093188 | orchestrator | 00:01:43.093 STDOUT terraform:  + id = (known after apply) 2025-09-27 00:01:43.093202 | orchestrator | 00:01:43.093 STDOUT terraform:  + protocol = "tcp" 2025-09-27 00:01:43.093243 | orchestrator | 00:01:43.093 STDOUT terraform:  + region = (known after apply) 2025-09-27 00:01:43.093259 | orchestrator | 00:01:43.093 STDOUT terraform:  + remote_address_group_id = (known after apply) 2025-09-27 00:01:43.093362 | orchestrator | 00:01:43.093 STDOUT terraform:  + remote_group_id = (known after apply) 2025-09-27 00:01:43.093381 | orchestrator | 00:01:43.093 STDOUT terraform:  + remote_ip_prefix = "192.168.16.0/20" 2025-09-27 00:01:43.093393 | orchestrator | 00:01:43.093 STDOUT terraform:  + security_group_id = (known after apply) 2025-09-27 00:01:43.093402 | orchestrator | 00:01:43.093 STDOUT terraform:  + tenant_id = (known after apply) 2025-09-27 00:01:43.093412 | orchestrator | 00:01:43.093 STDOUT terraform:  } 2025-09-27 00:01:43.093466 | orchestrator | 00:01:43.093 STDOUT terraform:  # openstack_networking_secgroup_rule_v2.security_group_management_rule4 will be created 2025-09-27 00:01:43.093515 | orchestrator | 00:01:43.093 STDOUT terraform:  + resource "openstack_networking_secgroup_rule_v2" "security_group_management_rule4" { 2025-09-27 
00:01:43.093543 | orchestrator | 00:01:43.093 STDOUT terraform:  + direction = "ingress" 2025-09-27 00:01:43.093563 | orchestrator | 00:01:43.093 STDOUT terraform:  + ethertype = "IPv4" 2025-09-27 00:01:43.093598 | orchestrator | 00:01:43.093 STDOUT terraform:  + id = (known after apply) 2025-09-27 00:01:43.093622 | orchestrator | 00:01:43.093 STDOUT terraform:  + protocol = "udp" 2025-09-27 00:01:43.093657 | orchestrator | 00:01:43.093 STDOUT terraform:  + region = (known after apply) 2025-09-27 00:01:43.093692 | orchestrator | 00:01:43.093 STDOUT terraform:  + remote_address_group_id = (known after apply) 2025-09-27 00:01:43.093740 | orchestrator | 00:01:43.093 STDOUT terraform:  + remote_group_id = (known after apply) 2025-09-27 00:01:43.093774 | orchestrator | 00:01:43.093 STDOUT terraform:  + remote_ip_prefix = "192.168.16.0/20" 2025-09-27 00:01:43.093808 | orchestrator | 00:01:43.093 STDOUT terraform:  + security_group_id = (known after apply) 2025-09-27 00:01:43.093844 | orchestrator | 00:01:43.093 STDOUT terraform:  + tenant_id = (known after apply) 2025-09-27 00:01:43.093851 | orchestrator | 00:01:43.093 STDOUT terraform:  } 2025-09-27 00:01:43.093905 | orchestrator | 00:01:43.093 STDOUT terraform:  # openstack_networking_secgroup_rule_v2.security_group_management_rule5 will be created 2025-09-27 00:01:43.093956 | orchestrator | 00:01:43.093 STDOUT terraform:  + resource "openstack_networking_secgroup_rule_v2" "security_group_management_rule5" { 2025-09-27 00:01:43.093985 | orchestrator | 00:01:43.093 STDOUT terraform:  + direction = "ingress" 2025-09-27 00:01:43.094004 | orchestrator | 00:01:43.093 STDOUT terraform:  + ethertype = "IPv4" 2025-09-27 00:01:43.094054 | orchestrator | 00:01:43.093 STDOUT terraform:  + id = (known after apply) 2025-09-27 00:01:43.094081 | orchestrator | 00:01:43.094 STDOUT terraform:  + protocol = "icmp" 2025-09-27 00:01:43.094121 | orchestrator | 00:01:43.094 STDOUT terraform:  + region = (known after apply) 2025-09-27 00:01:43.094158 | orchestrator | 00:01:43.094 STDOUT terraform:  + remote_address_group_id = (known after apply) 2025-09-27 00:01:43.094195 | orchestrator | 00:01:43.094 STDOUT terraform:  + remote_group_id = (known after apply) 2025-09-27 00:01:43.094227 | orchestrator | 00:01:43.094 STDOUT terraform:  + remote_ip_prefix = "0.0.0.0/0" 2025-09-27 00:01:43.094263 | orchestrator | 00:01:43.094 STDOUT terraform:  + security_group_id = (known after apply) 2025-09-27 00:01:43.094300 | orchestrator | 00:01:43.094 STDOUT terraform:  + tenant_id = (known after apply) 2025-09-27 00:01:43.094307 | orchestrator | 00:01:43.094 STDOUT terraform:  } 2025-09-27 00:01:43.094359 | orchestrator | 00:01:43.094 STDOUT terraform:  # openstack_networking_secgroup_rule_v2.security_group_node_rule1 will be created 2025-09-27 00:01:43.094414 | orchestrator | 00:01:43.094 STDOUT terraform:  + resource "openstack_networking_secgroup_rule_v2" "security_group_node_rule1" { 2025-09-27 00:01:43.094442 | orchestrator | 00:01:43.094 STDOUT terraform:  + direction = "ingress" 2025-09-27 00:01:43.094468 | orchestrator | 00:01:43.094 STDOUT terraform:  + ethertype = "IPv4" 2025-09-27 00:01:43.094503 | orchestrator | 00:01:43.094 STDOUT terraform:  + id = (known after apply) 2025-09-27 00:01:43.094527 | orchestrator | 00:01:43.094 STDOUT terraform:  + protocol = "tcp" 2025-09-27 00:01:43.094562 | orchestrator | 00:01:43.094 STDOUT terraform:  + region = (known after apply) 2025-09-27 00:01:43.094597 | orchestrator | 00:01:43.094 STDOUT terraform:  + 
remote_address_group_id = (known after apply) 2025-09-27 00:01:43.094632 | orchestrator | 00:01:43.094 STDOUT terraform:  + remote_group_id = (known after apply) 2025-09-27 00:01:43.094661 | orchestrator | 00:01:43.094 STDOUT terraform:  + remote_ip_prefix = "0.0.0.0/0" 2025-09-27 00:01:43.094695 | orchestrator | 00:01:43.094 STDOUT terraform:  + security_group_id = (known after apply) 2025-09-27 00:01:43.094731 | orchestrator | 00:01:43.094 STDOUT terraform:  + tenant_id = (known after apply) 2025-09-27 00:01:43.094737 | orchestrator | 00:01:43.094 STDOUT terraform:  } 2025-09-27 00:01:43.094788 | orchestrator | 00:01:43.094 STDOUT terraform:  # openstack_networking_secgroup_rule_v2.security_group_node_rule2 will be created 2025-09-27 00:01:43.094837 | orchestrator | 00:01:43.094 STDOUT terraform:  + resource "openstack_networking_secgroup_rule_v2" "security_group_node_rule2" { 2025-09-27 00:01:43.094865 | orchestrator | 00:01:43.094 STDOUT terraform:  + direction = "ingress" 2025-09-27 00:01:43.094889 | orchestrator | 00:01:43.094 STDOUT terraform:  + ethertype = "IPv4" 2025-09-27 00:01:43.094926 | orchestrator | 00:01:43.094 STDOUT terraform:  + id = (known after apply) 2025-09-27 00:01:43.094951 | orchestrator | 00:01:43.094 STDOUT terraform:  + protocol = "udp" 2025-09-27 00:01:43.094987 | orchestrator | 00:01:43.094 STDOUT terraform:  + region = (known after apply) 2025-09-27 00:01:43.095022 | orchestrator | 00:01:43.094 STDOUT terraform:  + remote_address_group_id = (known after apply) 2025-09-27 00:01:43.095058 | orchestrator | 00:01:43.095 STDOUT terraform:  + remote_group_id = (known after apply) 2025-09-27 00:01:43.095086 | orchestrator | 00:01:43.095 STDOUT terraform:  + remote_ip_prefix = "0.0.0.0/0" 2025-09-27 00:01:43.095122 | orchestrator | 00:01:43.095 STDOUT terraform:  + security_group_id = (known after apply) 2025-09-27 00:01:43.095164 | orchestrator | 00:01:43.095 STDOUT terraform:  + tenant_id = (known after apply) 2025-09-27 00:01:43.095171 | orchestrator | 00:01:43.095 STDOUT terraform:  } 2025-09-27 00:01:43.095222 | orchestrator | 00:01:43.095 STDOUT terraform:  # openstack_networking_secgroup_rule_v2.security_group_node_rule3 will be created 2025-09-27 00:01:43.095271 | orchestrator | 00:01:43.095 STDOUT terraform:  + resource "openstack_networking_secgroup_rule_v2" "security_group_node_rule3" { 2025-09-27 00:01:43.095298 | orchestrator | 00:01:43.095 STDOUT terraform:  + direction = "ingress" 2025-09-27 00:01:43.095322 | orchestrator | 00:01:43.095 STDOUT terraform:  + ethertype = "IPv4" 2025-09-27 00:01:43.095358 | orchestrator | 00:01:43.095 STDOUT terraform:  + id = (known after apply) 2025-09-27 00:01:43.095383 | orchestrator | 00:01:43.095 STDOUT terraform:  + protocol = "icmp" 2025-09-27 00:01:43.095418 | orchestrator | 00:01:43.095 STDOUT terraform:  + region = (known after apply) 2025-09-27 00:01:43.095452 | orchestrator | 00:01:43.095 STDOUT terraform:  + remote_address_group_id = (known after apply) 2025-09-27 00:01:43.095488 | orchestrator | 00:01:43.095 STDOUT terraform:  + remote_group_id = (known after apply) 2025-09-27 00:01:43.095517 | orchestrator | 00:01:43.095 STDOUT terraform:  + remote_ip_prefix = "0.0.0.0/0" 2025-09-27 00:01:43.095552 | orchestrator | 00:01:43.095 STDOUT terraform:  + security_group_id = (known after apply) 2025-09-27 00:01:43.095586 | orchestrator | 00:01:43.095 STDOUT terraform:  + tenant_id = (known after apply) 2025-09-27 00:01:43.095593 | orchestrator | 00:01:43.095 STDOUT terraform:  } 2025-09-27 00:01:43.095644 | 
orchestrator | 00:01:43.095 STDOUT terraform:  # openstack_networking_secgroup_rule_v2.security_group_rule_vrrp will be created 2025-09-27 00:01:43.095693 | orchestrator | 00:01:43.095 STDOUT terraform:  + resource "openstack_networking_secgroup_rule_v2" "security_group_rule_vrrp" { 2025-09-27 00:01:43.095717 | orchestrator | 00:01:43.095 STDOUT terraform:  + description = "vrrp" 2025-09-27 00:01:43.095746 | orchestrator | 00:01:43.095 STDOUT terraform:  + direction = "ingress" 2025-09-27 00:01:43.095771 | orchestrator | 00:01:43.095 STDOUT terraform:  + ethertype = "IPv4" 2025-09-27 00:01:43.095827 | orchestrator | 00:01:43.095 STDOUT terraform:  + id = (known after apply) 2025-09-27 00:01:43.095835 | orchestrator | 00:01:43.095 STDOUT terraform:  + protocol = "112" 2025-09-27 00:01:43.095863 | orchestrator | 00:01:43.095 STDOUT terraform:  + region = (known after apply) 2025-09-27 00:01:43.095897 | orchestrator | 00:01:43.095 STDOUT terraform:  + remote_address_group_id = (known after apply) 2025-09-27 00:01:43.095932 | orchestrator | 00:01:43.095 STDOUT terraform:  + remote_group_id = (known after apply) 2025-09-27 00:01:43.095960 | orchestrator | 00:01:43.095 STDOUT terraform:  + remote_ip_prefix = "0.0.0.0/0" 2025-09-27 00:01:43.095995 | orchestrator | 00:01:43.095 STDOUT terraform:  + security_group_id = (known after apply) 2025-09-27 00:01:43.096030 | orchestrator | 00:01:43.095 STDOUT terraform:  + tenant_id = (known after apply) 2025-09-27 00:01:43.096037 | orchestrator | 00:01:43.096 STDOUT terraform:  } 2025-09-27 00:01:43.096086 | orchestrator | 00:01:43.096 STDOUT terraform:  # openstack_networking_secgroup_v2.security_group_management will be created 2025-09-27 00:01:43.096140 | orchestrator | 00:01:43.096 STDOUT terraform:  + resource "openstack_networking_secgroup_v2" "security_group_management" { 2025-09-27 00:01:43.096168 | orchestrator | 00:01:43.096 STDOUT terraform:  + all_tags = (known after apply) 2025-09-27 00:01:43.096200 | orchestrator | 00:01:43.096 STDOUT terraform:  + description = "management security group" 2025-09-27 00:01:43.096227 | orchestrator | 00:01:43.096 STDOUT terraform:  + id = (known after apply) 2025-09-27 00:01:43.096255 | orchestrator | 00:01:43.096 STDOUT terraform:  + name = "testbed-management" 2025-09-27 00:01:43.096281 | orchestrator | 00:01:43.096 STDOUT terraform:  + region = (known after apply) 2025-09-27 00:01:43.096310 | orchestrator | 00:01:43.096 STDOUT terraform:  + stateful = (known after apply) 2025-09-27 00:01:43.096337 | orchestrator | 00:01:43.096 STDOUT terraform:  + tenant_id = (known after apply) 2025-09-27 00:01:43.096344 | orchestrator | 00:01:43.096 STDOUT terraform:  } 2025-09-27 00:01:43.096389 | orchestrator | 00:01:43.096 STDOUT terraform:  # openstack_networking_secgroup_v2.security_group_node will be created 2025-09-27 00:01:43.096436 | orchestrator | 00:01:43.096 STDOUT terraform:  + resource "openstack_networking_secgroup_v2" "security_group_node" { 2025-09-27 00:01:43.096462 | orchestrator | 00:01:43.096 STDOUT terraform:  + all_tags = (known after apply) 2025-09-27 00:01:43.096489 | orchestrator | 00:01:43.096 STDOUT terraform:  + description = "node security group" 2025-09-27 00:01:43.096516 | orchestrator | 00:01:43.096 STDOUT terraform:  + id = (known after apply) 2025-09-27 00:01:43.096535 | orchestrator | 00:01:43.096 STDOUT terraform:  + name = "testbed-node" 2025-09-27 00:01:43.096563 | orchestrator | 00:01:43.096 STDOUT terraform:  + region = (known after apply) 2025-09-27 00:01:43.096590 | orchestrator 
| 00:01:43.096 STDOUT terraform:  + stateful = (known after apply) 2025-09-27 00:01:43.096617 | orchestrator | 00:01:43.096 STDOUT terraform:  + tenant_id = (known after apply) 2025-09-27 00:01:43.096624 | orchestrator | 00:01:43.096 STDOUT terraform:  } 2025-09-27 00:01:43.096668 | orchestrator | 00:01:43.096 STDOUT terraform:  # openstack_networking_subnet_v2.subnet_management will be created 2025-09-27 00:01:43.096710 | orchestrator | 00:01:43.096 STDOUT terraform:  + resource "openstack_networking_subnet_v2" "subnet_management" { 2025-09-27 00:01:43.096740 | orchestrator | 00:01:43.096 STDOUT terraform:  + all_tags = (known after apply) 2025-09-27 00:01:43.096770 | orchestrator | 00:01:43.096 STDOUT terraform:  + cidr = "192.168.16.0/20" 2025-09-27 00:01:43.096788 | orchestrator | 00:01:43.096 STDOUT terraform:  + dns_nameservers = [ 2025-09-27 00:01:43.096795 | orchestrator | 00:01:43.096 STDOUT terraform:  + "8.8.8.8", 2025-09-27 00:01:43.096805 | orchestrator | 00:01:43.096 STDOUT terraform:  + "9.9.9.9", 2025-09-27 00:01:43.096813 | orchestrator | 00:01:43.096 STDOUT terraform:  ] 2025-09-27 00:01:43.096839 | orchestrator | 00:01:43.096 STDOUT terraform:  + enable_dhcp = true 2025-09-27 00:01:43.096869 | orchestrator | 00:01:43.096 STDOUT terraform:  + gateway_ip = (known after apply) 2025-09-27 00:01:43.096898 | orchestrator | 00:01:43.096 STDOUT terraform:  + id = (known after apply) 2025-09-27 00:01:43.096905 | orchestrator | 00:01:43.096 STDOUT terraform:  + ip_version = 4 2025-09-27 00:01:43.096942 | orchestrator | 00:01:43.096 STDOUT terraform:  + ipv6_address_mode = (known after apply) 2025-09-27 00:01:43.096971 | orchestrator | 00:01:43.096 STDOUT terraform:  + ipv6_ra_mode = (known after apply) 2025-09-27 00:01:43.097007 | orchestrator | 00:01:43.096 STDOUT terraform:  + name = "subnet-testbed-management" 2025-09-27 00:01:43.097037 | orchestrator | 00:01:43.096 STDOUT terraform:  + network_id = (known after apply) 2025-09-27 00:01:43.097055 | orchestrator | 00:01:43.097 STDOUT terraform:  + no_gateway = false 2025-09-27 00:01:43.097084 | orchestrator | 00:01:43.097 STDOUT terraform:  + region = (known after apply) 2025-09-27 00:01:43.097113 | orchestrator | 00:01:43.097 STDOUT terraform:  + service_types = (known after apply) 2025-09-27 00:01:43.097243 | orchestrator | 00:01:43.097 STDOUT terraform:  + tenant_id = (known after apply) 2025-09-27 00:01:43.097286 | orchestrator | 00:01:43.097 STDOUT terraform:  + allocation_pool (known after apply) 2025-09-27 00:01:43.097303 | orchestrator | 00:01:43.097 STDOUT terraform:  } 2025-09-27 00:01:43.097317 | orchestrator | 00:01:43.097 STDOUT terraform:  # terraform_data.image will be created 2025-09-27 00:01:43.097335 | orchestrator | 00:01:43.097 STDOUT terraform:  + resource "terraform_data" "image" { 2025-09-27 00:01:43.097347 | orchestrator | 00:01:43.097 STDOUT terraform:  + id = (known after apply) 2025-09-27 00:01:43.097357 | orchestrator | 00:01:43.097 STDOUT terraform:  + input = "Ubuntu 24.04" 2025-09-27 00:01:43.097368 | orchestrator | 00:01:43.097 STDOUT terraform:  + output = (known after apply) 2025-09-27 00:01:43.097379 | orchestrator | 00:01:43.097 STDOUT terraform:  } 2025-09-27 00:01:43.097390 | orchestrator | 00:01:43.097 STDOUT terraform:  # terraform_data.image_node will be created 2025-09-27 00:01:43.097406 | orchestrator | 00:01:43.097 STDOUT terraform:  + resource "terraform_data" "image_node" { 2025-09-27 00:01:43.097417 | orchestrator | 00:01:43.097 STDOUT terraform:  + id = (known after apply) 
2025-09-27 00:01:43.097436 | orchestrator | 00:01:43.097 STDOUT terraform:  + input = "Ubuntu 24.04" 2025-09-27 00:01:43.097449 | orchestrator | 00:01:43.097 STDOUT terraform:  + output = (known after apply) 2025-09-27 00:01:43.097459 | orchestrator | 00:01:43.097 STDOUT terraform:  } 2025-09-27 00:01:43.097474 | orchestrator | 00:01:43.097 STDOUT terraform: Plan: 64 to add, 0 to change, 0 to destroy. 2025-09-27 00:01:43.097485 | orchestrator | 00:01:43.097 STDOUT terraform: Changes to Outputs: 2025-09-27 00:01:43.097496 | orchestrator | 00:01:43.097 STDOUT terraform:  + manager_address = (sensitive value) 2025-09-27 00:01:43.097507 | orchestrator | 00:01:43.097 STDOUT terraform:  + private_key = (sensitive value) 2025-09-27 00:01:43.267691 | orchestrator | 00:01:43.267 STDOUT terraform: terraform_data.image: Creating... 2025-09-27 00:01:43.268029 | orchestrator | 00:01:43.267 STDOUT terraform: terraform_data.image: Creation complete after 0s [id=77992b6a-d568-9f14-db7f-86a2ff2ecd0a] 2025-09-27 00:01:43.268501 | orchestrator | 00:01:43.268 STDOUT terraform: terraform_data.image_node: Creating... 2025-09-27 00:01:43.271321 | orchestrator | 00:01:43.268 STDOUT terraform: terraform_data.image_node: Creation complete after 0s [id=90fefe7a-a054-bfcd-720a-f2cfe8a1d0dc] 2025-09-27 00:01:43.283594 | orchestrator | 00:01:43.283 STDOUT terraform: data.openstack_images_image_v2.image: Reading... 2025-09-27 00:01:43.290894 | orchestrator | 00:01:43.290 STDOUT terraform: openstack_blockstorage_volume_v3.node_volume[1]: Creating... 2025-09-27 00:01:43.292167 | orchestrator | 00:01:43.292 STDOUT terraform: openstack_compute_keypair_v2.key: Creating... 2025-09-27 00:01:43.293727 | orchestrator | 00:01:43.293 STDOUT terraform: openstack_networking_network_v2.net_management: Creating... 2025-09-27 00:01:43.295960 | orchestrator | 00:01:43.295 STDOUT terraform: openstack_blockstorage_volume_v3.node_volume[5]: Creating... 2025-09-27 00:01:43.298523 | orchestrator | 00:01:43.298 STDOUT terraform: data.openstack_images_image_v2.image_node: Reading... 2025-09-27 00:01:43.302611 | orchestrator | 00:01:43.302 STDOUT terraform: openstack_blockstorage_volume_v3.node_volume[7]: Creating... 2025-09-27 00:01:43.302645 | orchestrator | 00:01:43.302 STDOUT terraform: openstack_blockstorage_volume_v3.node_volume[0]: Creating... 2025-09-27 00:01:43.302662 | orchestrator | 00:01:43.302 STDOUT terraform: openstack_blockstorage_volume_v3.node_volume[4]: Creating... 2025-09-27 00:01:43.308007 | orchestrator | 00:01:43.307 STDOUT terraform: openstack_blockstorage_volume_v3.node_volume[2]: Creating... 2025-09-27 00:01:43.746190 | orchestrator | 00:01:43.743 STDOUT terraform: data.openstack_images_image_v2.image: Read complete after 1s [id=846820b2-039e-4b42-adad-daf72e0f8ea4] 2025-09-27 00:01:43.751273 | orchestrator | 00:01:43.751 STDOUT terraform: openstack_blockstorage_volume_v3.node_volume[3]: Creating... 2025-09-27 00:01:43.757099 | orchestrator | 00:01:43.756 STDOUT terraform: data.openstack_images_image_v2.image_node: Read complete after 1s [id=846820b2-039e-4b42-adad-daf72e0f8ea4] 2025-09-27 00:01:43.765577 | orchestrator | 00:01:43.765 STDOUT terraform: openstack_blockstorage_volume_v3.node_volume[6]: Creating... 2025-09-27 00:01:43.785424 | orchestrator | 00:01:43.785 STDOUT terraform: openstack_compute_keypair_v2.key: Creation complete after 1s [id=testbed] 2025-09-27 00:01:43.790570 | orchestrator | 00:01:43.790 STDOUT terraform: openstack_blockstorage_volume_v3.node_volume[8]: Creating... 
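The terraform_data.image / data.openstack_images_image_v2.image pair resolved above pins an image name and then looks it up in Glance. A minimal sketch of that pattern, assuming a most_recent lookup; only the resource names and the "Ubuntu 24.04" value are taken from the plan output:

    # Pin the image name as plain data, then resolve it to a Glance image id.
    resource "terraform_data" "image" {
      input = "Ubuntu 24.04"                      # value shown in the plan
    }

    data "openstack_images_image_v2" "image" {
      name        = terraform_data.image.output   # equals the input once applied
      most_recent = true                          # assumption, not visible in the log
    }

The image id returned by the data source (846820b2-... above) is presumably what the base volumes and instances created later in the apply are built from.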
2025-09-27 00:01:44.434110 | orchestrator | 00:01:44.433 STDOUT terraform: openstack_networking_network_v2.net_management: Creation complete after 1s [id=3fa1d79a-fd41-4727-bd3e-df3a1d1bb1ff] 2025-09-27 00:01:44.496180 | orchestrator | 00:01:44.446 STDOUT terraform: openstack_blockstorage_volume_v3.manager_base_volume[0]: Creating... 2025-09-27 00:01:46.919316 | orchestrator | 00:01:46.918 STDOUT terraform: openstack_blockstorage_volume_v3.node_volume[4]: Creation complete after 4s [id=88b94aa1-4c02-44af-bedb-78cbed569408] 2025-09-27 00:01:46.928742 | orchestrator | 00:01:46.928 STDOUT terraform: openstack_blockstorage_volume_v3.node_base_volume[4]: Creating... 2025-09-27 00:01:46.961847 | orchestrator | 00:01:46.961 STDOUT terraform: openstack_blockstorage_volume_v3.node_volume[7]: Creation complete after 4s [id=f6166654-1631-4845-81e5-73fa20742766] 2025-09-27 00:01:46.972916 | orchestrator | 00:01:46.972 STDOUT terraform: openstack_blockstorage_volume_v3.node_base_volume[1]: Creating... 2025-09-27 00:01:46.974785 | orchestrator | 00:01:46.974 STDOUT terraform: openstack_blockstorage_volume_v3.node_volume[0]: Creation complete after 4s [id=db689dff-d74e-43e3-a305-79ec0de29e1e] 2025-09-27 00:01:46.976317 | orchestrator | 00:01:46.976 STDOUT terraform: openstack_blockstorage_volume_v3.node_volume[2]: Creation complete after 4s [id=3491b7a4-1f4d-422d-b24b-7572a092bd2f] 2025-09-27 00:01:46.979549 | orchestrator | 00:01:46.979 STDOUT terraform: openstack_blockstorage_volume_v3.node_base_volume[5]: Creating... 2025-09-27 00:01:46.980986 | orchestrator | 00:01:46.980 STDOUT terraform: openstack_blockstorage_volume_v3.node_base_volume[0]: Creating... 2025-09-27 00:01:46.998806 | orchestrator | 00:01:46.998 STDOUT terraform: openstack_blockstorage_volume_v3.node_volume[5]: Creation complete after 4s [id=06352aa6-6cdc-4b09-96e0-787a93e7d706] 2025-09-27 00:01:47.007219 | orchestrator | 00:01:47.007 STDOUT terraform: openstack_blockstorage_volume_v3.node_base_volume[2]: Creating... 2025-09-27 00:01:47.036323 | orchestrator | 00:01:47.036 STDOUT terraform: openstack_blockstorage_volume_v3.node_volume[6]: Creation complete after 3s [id=09efdf41-dbe9-4aba-b0d6-c49a377077cc] 2025-09-27 00:01:47.043392 | orchestrator | 00:01:47.043 STDOUT terraform: openstack_blockstorage_volume_v3.node_base_volume[3]: Creating... 2025-09-27 00:01:47.055196 | orchestrator | 00:01:47.054 STDOUT terraform: openstack_blockstorage_volume_v3.node_volume[3]: Creation complete after 3s [id=aa54db64-5ca4-4f56-bafa-5b00a4002696] 2025-09-27 00:01:47.070568 | orchestrator | 00:01:47.070 STDOUT terraform: openstack_blockstorage_volume_v3.node_volume[8]: Creation complete after 3s [id=44ee43e4-0ad4-479b-91ef-60ee60e7859d] 2025-09-27 00:01:47.071516 | orchestrator | 00:01:47.071 STDOUT terraform: local_file.id_rsa_pub: Creating... 2025-09-27 00:01:47.076960 | orchestrator | 00:01:47.076 STDOUT terraform: local_file.id_rsa_pub: Creation complete after 0s [id=dd982a183a68dfc6cb2ea6dbba51b78953be603f] 2025-09-27 00:01:47.080072 | orchestrator | 00:01:47.079 STDOUT terraform: openstack_blockstorage_volume_v3.node_volume[1]: Creation complete after 4s [id=e258aa1c-ff59-4b5b-956f-d2cfc00f460b] 2025-09-27 00:01:47.085597 | orchestrator | 00:01:47.085 STDOUT terraform: openstack_networking_subnet_v2.subnet_management: Creating... 2025-09-27 00:01:47.086377 | orchestrator | 00:01:47.086 STDOUT terraform: local_sensitive_file.id_rsa: Creating... 
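The key material handled here (openstack_compute_keypair_v2.key, local_file.id_rsa_pub, local_sensitive_file.id_rsa) usually hangs off a generated private key. A sketch under that assumption; the tls_private_key resource and the file paths are illustrative, and only the resource names and the keypair name "testbed" come from the log:

    resource "tls_private_key" "ssh" {             # hypothetical key source
      algorithm = "RSA"
      rsa_bits  = 4096
    }

    resource "openstack_compute_keypair_v2" "key" {
      name       = "testbed"                       # id=testbed in the log
      public_key = tls_private_key.ssh.public_key_openssh
    }

    resource "local_file" "id_rsa_pub" {
      content  = tls_private_key.ssh.public_key_openssh
      filename = "${path.module}/id_rsa.pub"       # assumed path
    }

    resource "local_sensitive_file" "id_rsa" {
      content  = tls_private_key.ssh.private_key_pem
      filename = "${path.module}/id_rsa"           # assumed path
    }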
2025-09-27 00:01:47.091373 | orchestrator | 00:01:47.091 STDOUT terraform: local_sensitive_file.id_rsa: Creation complete after 0s [id=f91dee5de130368d751bb0e5e39decb8710c16b6] 2025-09-27 00:01:47.788940 | orchestrator | 00:01:47.788 STDOUT terraform: openstack_blockstorage_volume_v3.manager_base_volume[0]: Creation complete after 4s [id=dcfe9540-4ca7-4cd8-a060-175e267c1ca6] 2025-09-27 00:01:48.049547 | orchestrator | 00:01:48.049 STDOUT terraform: openstack_networking_subnet_v2.subnet_management: Creation complete after 1s [id=c406e87e-2222-45ca-bf97-ebe10e871e7a] 2025-09-27 00:01:48.057351 | orchestrator | 00:01:48.057 STDOUT terraform: openstack_networking_router_v2.router: Creating... 2025-09-27 00:01:50.353158 | orchestrator | 00:01:50.352 STDOUT terraform: openstack_blockstorage_volume_v3.node_base_volume[4]: Creation complete after 3s [id=ccfdba47-3be1-47ca-9d9b-c14dc21688e2] 2025-09-27 00:01:50.406488 | orchestrator | 00:01:50.406 STDOUT terraform: openstack_blockstorage_volume_v3.node_base_volume[1]: Creation complete after 3s [id=a9409f28-5c60-4825-964c-57b4bb65617a] 2025-09-27 00:01:50.410981 | orchestrator | 00:01:50.410 STDOUT terraform: openstack_blockstorage_volume_v3.node_base_volume[5]: Creation complete after 3s [id=140074d2-f452-403c-a39b-9b7f03d301d0] 2025-09-27 00:01:50.451435 | orchestrator | 00:01:50.451 STDOUT terraform: openstack_blockstorage_volume_v3.node_base_volume[2]: Creation complete after 3s [id=6dbfecf9-f5ff-4b34-b71a-2ff17908cbf0] 2025-09-27 00:01:50.456962 | orchestrator | 00:01:50.456 STDOUT terraform: openstack_blockstorage_volume_v3.node_base_volume[0]: Creation complete after 3s [id=0d8bc224-c074-480c-a812-f41f761871f6] 2025-09-27 00:01:50.471709 | orchestrator | 00:01:50.471 STDOUT terraform: openstack_blockstorage_volume_v3.node_base_volume[3]: Creation complete after 3s [id=9830fc06-72ca-4b97-ae11-006364930d3a] 2025-09-27 00:01:51.212963 | orchestrator | 00:01:51.212 STDOUT terraform: openstack_networking_router_v2.router: Creation complete after 3s [id=ab4b5332-5872-4337-9c50-09f513820f7f] 2025-09-27 00:01:51.221314 | orchestrator | 00:01:51.220 STDOUT terraform: openstack_networking_secgroup_v2.security_group_management: Creating... 2025-09-27 00:01:51.223839 | orchestrator | 00:01:51.223 STDOUT terraform: openstack_networking_router_interface_v2.router_interface: Creating... 2025-09-27 00:01:51.226794 | orchestrator | 00:01:51.226 STDOUT terraform: openstack_networking_secgroup_v2.security_group_node: Creating... 2025-09-27 00:01:51.395303 | orchestrator | 00:01:51.394 STDOUT terraform: openstack_networking_secgroup_v2.security_group_management: Creation complete after 0s [id=a5628a19-a66d-499b-a845-e811c7104adc] 2025-09-27 00:01:51.408062 | orchestrator | 00:01:51.407 STDOUT terraform: openstack_networking_secgroup_rule_v2.security_group_management_rule1: Creating... 2025-09-27 00:01:51.408637 | orchestrator | 00:01:51.408 STDOUT terraform: openstack_networking_secgroup_rule_v2.security_group_management_rule2: Creating... 2025-09-27 00:01:51.410898 | orchestrator | 00:01:51.410 STDOUT terraform: openstack_networking_secgroup_rule_v2.security_group_management_rule4: Creating... 2025-09-27 00:01:51.415550 | orchestrator | 00:01:51.414 STDOUT terraform: openstack_networking_secgroup_rule_v2.security_group_management_rule5: Creating... 2025-09-27 00:01:51.417252 | orchestrator | 00:01:51.416 STDOUT terraform: openstack_networking_secgroup_rule_v2.security_group_management_rule3: Creating... 
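The management subnet, router and router interface that just completed map onto three small Neutron resources. The CIDR, DNS servers, router name, external network id and availability-zone hint below are the values printed in the plan; the cross-references between the resources are the obvious wiring but still assumptions:

    resource "openstack_networking_subnet_v2" "subnet_management" {
      name            = "subnet-testbed-management"
      network_id      = openstack_networking_network_v2.net_management.id   # assumed reference
      cidr            = "192.168.16.0/20"
      ip_version      = 4
      enable_dhcp     = true
      dns_nameservers = ["8.8.8.8", "9.9.9.9"]
    }

    resource "openstack_networking_router_v2" "router" {
      name                    = "testbed"
      external_network_id     = "e6be7364-bfd8-4de7-8120-8f41c69a139a"
      availability_zone_hints = ["nova"]
    }

    resource "openstack_networking_router_interface_v2" "router_interface" {
      router_id = openstack_networking_router_v2.router.id
      subnet_id = openstack_networking_subnet_v2.subnet_management.id
    }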
2025-09-27 00:01:51.417296 | orchestrator | 00:01:51.417 STDOUT terraform: openstack_networking_port_v2.manager_port_management: Creating... 2025-09-27 00:01:51.431984 | orchestrator | 00:01:51.431 STDOUT terraform: openstack_networking_secgroup_v2.security_group_node: Creation complete after 0s [id=302ac1bf-c543-4213-aae0-e2eb3430e733] 2025-09-27 00:01:51.435023 | orchestrator | 00:01:51.434 STDOUT terraform: openstack_networking_secgroup_rule_v2.security_group_node_rule3: Creating... 2025-09-27 00:01:51.435493 | orchestrator | 00:01:51.435 STDOUT terraform: openstack_networking_secgroup_rule_v2.security_group_rule_vrrp: Creating... 2025-09-27 00:01:51.443378 | orchestrator | 00:01:51.443 STDOUT terraform: openstack_networking_port_v2.node_port_management[3]: Creating... 2025-09-27 00:01:51.604843 | orchestrator | 00:01:51.604 STDOUT terraform: openstack_networking_secgroup_rule_v2.security_group_rule_vrrp: Creation complete after 1s [id=606c446e-b556-43c3-bfb1-86f08daaeb89] 2025-09-27 00:01:51.621684 | orchestrator | 00:01:51.620 STDOUT terraform: openstack_networking_secgroup_rule_v2.security_group_management_rule2: Creation complete after 1s [id=6308a8fa-5275-4ff5-8c24-872bb59d6c40] 2025-09-27 00:01:51.626570 | orchestrator | 00:01:51.626 STDOUT terraform: openstack_networking_port_v2.node_port_management[2]: Creating... 2025-09-27 00:01:51.628557 | orchestrator | 00:01:51.628 STDOUT terraform: openstack_networking_port_v2.node_port_management[5]: Creating... 2025-09-27 00:01:51.782693 | orchestrator | 00:01:51.782 STDOUT terraform: openstack_networking_secgroup_rule_v2.security_group_node_rule3: Creation complete after 1s [id=ad2c9f5a-f9c3-4984-96c8-cf550329cea7] 2025-09-27 00:01:51.796734 | orchestrator | 00:01:51.796 STDOUT terraform: openstack_networking_port_v2.node_port_management[4]: Creating... 2025-09-27 00:01:51.901635 | orchestrator | 00:01:51.901 STDOUT terraform: openstack_networking_secgroup_rule_v2.security_group_management_rule1: Creation complete after 1s [id=bc91a5af-9576-4244-9c33-7ebd04978946] 2025-09-27 00:01:51.915810 | orchestrator | 00:01:51.915 STDOUT terraform: openstack_networking_port_v2.node_port_management[1]: Creating... 2025-09-27 00:01:52.126703 | orchestrator | 00:01:52.126 STDOUT terraform: openstack_networking_port_v2.node_port_management[3]: Creation complete after 1s [id=7ceef1a8-0eff-45fe-a6a0-65d60cf74770] 2025-09-27 00:01:52.142695 | orchestrator | 00:01:52.142 STDOUT terraform: openstack_networking_port_v2.node_port_management[0]: Creating... 2025-09-27 00:01:52.145647 | orchestrator | 00:01:52.145 STDOUT terraform: openstack_networking_port_v2.manager_port_management: Creation complete after 1s [id=75825c42-c538-476e-933f-51f6cabe9e10] 2025-09-27 00:01:52.152051 | orchestrator | 00:01:52.151 STDOUT terraform: openstack_networking_secgroup_rule_v2.security_group_node_rule2: Creating... 2025-09-27 00:01:52.349758 | orchestrator | 00:01:52.349 STDOUT terraform: openstack_networking_secgroup_rule_v2.security_group_node_rule2: Creation complete after 0s [id=0abf65d6-88e5-4cc8-a313-1eb17a1ae1d1] 2025-09-27 00:01:52.355311 | orchestrator | 00:01:52.355 STDOUT terraform: openstack_networking_secgroup_rule_v2.security_group_node_rule1: Creating... 
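The security-group rules being created here all follow one pattern: an ingress rule scoped to a group, with the protocol and ports shown in the plan. Two representative rules as a sketch; which group each rule attaches to is only partly visible in the log, so the security_group_id references are assumptions:

    resource "openstack_networking_secgroup_v2" "security_group_management" {
      name        = "testbed-management"
      description = "management security group"
    }

    resource "openstack_networking_secgroup_v2" "security_group_node" {
      name        = "testbed-node"
      description = "node security group"
    }

    # "ssh" rule, values as printed in the plan
    resource "openstack_networking_secgroup_rule_v2" "security_group_management_rule1" {
      description       = "ssh"
      direction         = "ingress"
      ethertype         = "IPv4"
      protocol          = "tcp"
      port_range_min    = 22
      port_range_max    = 22
      remote_ip_prefix  = "0.0.0.0/0"
      security_group_id = openstack_networking_secgroup_v2.security_group_management.id
    }

    # VRRP is IP protocol 112, passed as a string, with no port range
    resource "openstack_networking_secgroup_rule_v2" "security_group_rule_vrrp" {
      description       = "vrrp"
      direction         = "ingress"
      ethertype         = "IPv4"
      protocol          = "112"
      remote_ip_prefix  = "0.0.0.0/0"
      security_group_id = openstack_networking_secgroup_v2.security_group_node.id   # assumed group
    }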
2025-09-27 00:01:52.389768 | orchestrator | 00:01:52.389 STDOUT terraform: openstack_networking_port_v2.node_port_management[2]: Creation complete after 0s [id=e91cb4df-06df-4852-805a-cdc5aac64a4a] 2025-09-27 00:01:52.401200 | orchestrator | 00:01:52.400 STDOUT terraform: openstack_networking_secgroup_rule_v2.security_group_management_rule4: Creation complete after 1s [id=020b2d6e-f4b5-4acd-b9c9-f32da3bd9812] 2025-09-27 00:01:52.512683 | orchestrator | 00:01:52.512 STDOUT terraform: openstack_networking_secgroup_rule_v2.security_group_node_rule1: Creation complete after 1s [id=3f487420-dc58-49fd-b951-e7537af76549] 2025-09-27 00:01:52.555549 | orchestrator | 00:01:52.555 STDOUT terraform: openstack_networking_port_v2.node_port_management[5]: Creation complete after 1s [id=47873cf6-2a68-4d7f-a86d-564687dd5aa4] 2025-09-27 00:01:52.643354 | orchestrator | 00:01:52.643 STDOUT terraform: openstack_networking_port_v2.node_port_management[1]: Creation complete after 1s [id=7fff1363-c11b-49bf-917d-b5d5d67aa8d4] 2025-09-27 00:01:52.665338 | orchestrator | 00:01:52.665 STDOUT terraform: openstack_networking_port_v2.node_port_management[4]: Creation complete after 1s [id=fdee2298-3643-4c99-b472-dc41777fdaef] 2025-09-27 00:01:52.973743 | orchestrator | 00:01:52.973 STDOUT terraform: openstack_networking_port_v2.node_port_management[0]: Creation complete after 1s [id=91107a2d-d933-4c00-8bef-7f70db12714a] 2025-09-27 00:01:53.549965 | orchestrator | 00:01:53.549 STDOUT terraform: openstack_networking_secgroup_rule_v2.security_group_management_rule3: Creation complete after 3s [id=fb0e03b7-fe37-4144-96b7-d1982d2e1d03] 2025-09-27 00:01:53.721269 | orchestrator | 00:01:53.720 STDOUT terraform: openstack_networking_secgroup_rule_v2.security_group_management_rule5: Creation complete after 3s [id=59b18556-5b87-40b8-9c3f-662d0ead1fd6] 2025-09-27 00:01:54.031487 | orchestrator | 00:01:54.031 STDOUT terraform: openstack_networking_router_interface_v2.router_interface: Creation complete after 3s [id=cb5019b0-19d1-4efc-9743-61fed748e01e] 2025-09-27 00:01:54.058451 | orchestrator | 00:01:54.058 STDOUT terraform: openstack_compute_instance_v2.node_server[3]: Creating... 2025-09-27 00:01:54.062185 | orchestrator | 00:01:54.062 STDOUT terraform: openstack_compute_instance_v2.node_server[1]: Creating... 2025-09-27 00:01:54.063206 | orchestrator | 00:01:54.063 STDOUT terraform: openstack_networking_floatingip_v2.manager_floating_ip: Creating... 2025-09-27 00:01:54.064469 | orchestrator | 00:01:54.063 STDOUT terraform: openstack_compute_instance_v2.node_server[2]: Creating... 2025-09-27 00:01:54.066526 | orchestrator | 00:01:54.066 STDOUT terraform: openstack_compute_instance_v2.node_server[4]: Creating... 2025-09-27 00:01:54.081912 | orchestrator | 00:01:54.081 STDOUT terraform: openstack_compute_instance_v2.node_server[0]: Creating... 2025-09-27 00:01:54.083531 | orchestrator | 00:01:54.083 STDOUT terraform: openstack_compute_instance_v2.node_server[5]: Creating... 2025-09-27 00:01:55.449609 | orchestrator | 00:01:55.449 STDOUT terraform: openstack_networking_floatingip_v2.manager_floating_ip: Creation complete after 1s [id=e26df3c8-e62c-447a-b6fd-c89319430431] 2025-09-27 00:01:55.455997 | orchestrator | 00:01:55.455 STDOUT terraform: openstack_networking_floatingip_associate_v2.manager_floating_ip_association: Creating... 2025-09-27 00:01:55.461835 | orchestrator | 00:01:55.461 STDOUT terraform: local_file.inventory: Creating... 
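Each management port finishing above carries one fixed IP plus allowed_address_pairs entries, which Neutron port security requires so that traffic for addresses other than the port's own fixed IP (VIPs, internal prefixes) is not dropped. A single-port sketch with values from the plan; the network/subnet references are assumptions and the count logic used for the six ports is omitted:

    resource "openstack_networking_port_v2" "node_port_management" {
      network_id = openstack_networking_network_v2.net_management.id          # assumed reference

      fixed_ip {
        subnet_id  = openstack_networking_subnet_v2.subnet_management.id      # assumed reference
        ip_address = "192.168.16.15"                                          # from the plan
      }

      # Additional addresses/prefixes the port may answer for, as listed in the plan
      allowed_address_pairs {
        ip_address = "192.168.112.0/20"
      }
      allowed_address_pairs {
        ip_address = "192.168.16.254/20"
      }
    }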
2025-09-27 00:01:55.467993 | orchestrator | 00:01:55.467 STDOUT terraform: local_file.MANAGER_ADDRESS: Creating... 2025-09-27 00:01:55.472381 | orchestrator | 00:01:55.472 STDOUT terraform: local_file.MANAGER_ADDRESS: Creation complete after 0s [id=4ece67e5f9b5d7b0aadbeed36313139e691c686e] 2025-09-27 00:01:55.476108 | orchestrator | 00:01:55.475 STDOUT terraform: local_file.inventory: Creation complete after 0s [id=b91fd14604e7cbff4658e6c2802388391487ce9c] 2025-09-27 00:01:56.701112 | orchestrator | 00:01:56.700 STDOUT terraform: openstack_networking_floatingip_associate_v2.manager_floating_ip_association: Creation complete after 2s [id=e26df3c8-e62c-447a-b6fd-c89319430431] 2025-09-27 00:02:04.060656 | orchestrator | 00:02:04.060 STDOUT terraform: openstack_compute_instance_v2.node_server[3]: Still creating... [10s elapsed] 2025-09-27 00:02:04.064709 | orchestrator | 00:02:04.064 STDOUT terraform: openstack_compute_instance_v2.node_server[1]: Still creating... [10s elapsed] 2025-09-27 00:02:04.064774 | orchestrator | 00:02:04.064 STDOUT terraform: openstack_compute_instance_v2.node_server[2]: Still creating... [10s elapsed] 2025-09-27 00:02:04.067890 | orchestrator | 00:02:04.067 STDOUT terraform: openstack_compute_instance_v2.node_server[4]: Still creating... [10s elapsed] 2025-09-27 00:02:04.082413 | orchestrator | 00:02:04.082 STDOUT terraform: openstack_compute_instance_v2.node_server[0]: Still creating... [10s elapsed] 2025-09-27 00:02:04.086860 | orchestrator | 00:02:04.086 STDOUT terraform: openstack_compute_instance_v2.node_server[5]: Still creating... [10s elapsed] 2025-09-27 00:02:14.061596 | orchestrator | 00:02:14.061 STDOUT terraform: openstack_compute_instance_v2.node_server[3]: Still creating... [20s elapsed] 2025-09-27 00:02:14.065655 | orchestrator | 00:02:14.065 STDOUT terraform: openstack_compute_instance_v2.node_server[2]: Still creating... [20s elapsed] 2025-09-27 00:02:14.065949 | orchestrator | 00:02:14.065 STDOUT terraform: openstack_compute_instance_v2.node_server[1]: Still creating... [20s elapsed] 2025-09-27 00:02:14.068921 | orchestrator | 00:02:14.068 STDOUT terraform: openstack_compute_instance_v2.node_server[4]: Still creating... [20s elapsed] 2025-09-27 00:02:14.083233 | orchestrator | 00:02:14.083 STDOUT terraform: openstack_compute_instance_v2.node_server[0]: Still creating... [20s elapsed] 2025-09-27 00:02:14.087580 | orchestrator | 00:02:14.087 STDOUT terraform: openstack_compute_instance_v2.node_server[5]: Still creating... [20s elapsed] 2025-09-27 00:02:14.824932 | orchestrator | 00:02:14.824 STDOUT terraform: openstack_compute_instance_v2.node_server[0]: Creation complete after 21s [id=5b3867c7-26a7-4b92-a32d-39b71a625a07] 2025-09-27 00:02:14.831630 | orchestrator | 00:02:14.831 STDOUT terraform: openstack_compute_instance_v2.node_server[1]: Creation complete after 21s [id=1edb60f4-80d4-4fc7-a2bc-2ad61dfbd932] 2025-09-27 00:02:14.950200 | orchestrator | 00:02:14.949 STDOUT terraform: openstack_compute_instance_v2.node_server[3]: Creation complete after 21s [id=43996d3c-c809-4dd4-8535-c6f0e695ee04] 2025-09-27 00:02:15.380398 | orchestrator | 00:02:15.379 STDOUT terraform: openstack_compute_instance_v2.node_server[4]: Creation complete after 21s [id=a0277d4e-0848-420e-bfd9-7ae31d4ade25] 2025-09-27 00:02:24.066845 | orchestrator | 00:02:24.066 STDOUT terraform: openstack_compute_instance_v2.node_server[2]: Still creating... 
[30s elapsed] 2025-09-27 00:02:24.087957 | orchestrator | 00:02:24.087 STDOUT terraform: openstack_compute_instance_v2.node_server[5]: Still creating... [30s elapsed] 2025-09-27 00:02:25.175636 | orchestrator | 00:02:25.175 STDOUT terraform: openstack_compute_instance_v2.node_server[2]: Creation complete after 31s [id=fbcc9cc5-5c99-43ac-9d39-8070f22e6e8d] 2025-09-27 00:02:25.743488 | orchestrator | 00:02:25.743 STDOUT terraform: openstack_compute_instance_v2.node_server[5]: Creation complete after 32s [id=65591551-f935-47c6-9f23-d32d473dca71] 2025-09-27 00:02:25.779570 | orchestrator | 00:02:25.779 STDOUT terraform: null_resource.node_semaphore: Creating... 2025-09-27 00:02:25.782901 | orchestrator | 00:02:25.782 STDOUT terraform: openstack_compute_volume_attach_v2.node_volume_attachment[7]: Creating... 2025-09-27 00:02:25.782941 | orchestrator | 00:02:25.782 STDOUT terraform: openstack_compute_volume_attach_v2.node_volume_attachment[4]: Creating... 2025-09-27 00:02:25.783582 | orchestrator | 00:02:25.783 STDOUT terraform: null_resource.node_semaphore: Creation complete after 0s [id=9071923335424951675] 2025-09-27 00:02:25.784231 | orchestrator | 00:02:25.783 STDOUT terraform: openstack_compute_volume_attach_v2.node_volume_attachment[2]: Creating... 2025-09-27 00:02:25.784774 | orchestrator | 00:02:25.784 STDOUT terraform: openstack_compute_volume_attach_v2.node_volume_attachment[1]: Creating... 2025-09-27 00:02:25.785559 | orchestrator | 00:02:25.785 STDOUT terraform: openstack_compute_volume_attach_v2.node_volume_attachment[6]: Creating... 2025-09-27 00:02:25.787758 | orchestrator | 00:02:25.787 STDOUT terraform: openstack_compute_volume_attach_v2.node_volume_attachment[3]: Creating... 2025-09-27 00:02:25.792591 | orchestrator | 00:02:25.792 STDOUT terraform: openstack_compute_volume_attach_v2.node_volume_attachment[8]: Creating... 2025-09-27 00:02:25.798184 | orchestrator | 00:02:25.796 STDOUT terraform: openstack_compute_volume_attach_v2.node_volume_attachment[0]: Creating... 2025-09-27 00:02:25.800124 | orchestrator | 00:02:25.800 STDOUT terraform: openstack_compute_volume_attach_v2.node_volume_attachment[5]: Creating... 2025-09-27 00:02:25.826646 | orchestrator | 00:02:25.826 STDOUT terraform: openstack_compute_instance_v2.manager_server: Creating... 
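The nine node_volume_attachment resources pair the extra data volumes with node servers; the attach resource itself only needs an instance id and a volume id. The modulo mapping below reproduces the instance/volume id pairs visible in the log, but it is an inference for illustration, not code taken from the project:

    resource "openstack_compute_volume_attach_v2" "node_volume_attachment" {
      count       = 9
      # Assumed mapping: three data volumes per resource node (node_server[3..5]),
      # consistent with the id pairs printed below.
      instance_id = openstack_compute_instance_v2.node_server[3 + count.index % 3].id
      volume_id   = openstack_blockstorage_volume_v3.node_volume[count.index].id
    }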
2025-09-27 00:02:29.172528 | orchestrator | 00:02:29.172 STDOUT terraform: openstack_compute_volume_attach_v2.node_volume_attachment[2]: Creation complete after 3s [id=65591551-f935-47c6-9f23-d32d473dca71/3491b7a4-1f4d-422d-b24b-7572a092bd2f] 2025-09-27 00:02:29.191109 | orchestrator | 00:02:29.190 STDOUT terraform: openstack_compute_volume_attach_v2.node_volume_attachment[4]: Creation complete after 3s [id=a0277d4e-0848-420e-bfd9-7ae31d4ade25/88b94aa1-4c02-44af-bedb-78cbed569408] 2025-09-27 00:02:29.210311 | orchestrator | 00:02:29.209 STDOUT terraform: openstack_compute_volume_attach_v2.node_volume_attachment[5]: Creation complete after 3s [id=65591551-f935-47c6-9f23-d32d473dca71/06352aa6-6cdc-4b09-96e0-787a93e7d706] 2025-09-27 00:02:29.219038 | orchestrator | 00:02:29.218 STDOUT terraform: openstack_compute_volume_attach_v2.node_volume_attachment[6]: Creation complete after 3s [id=43996d3c-c809-4dd4-8535-c6f0e695ee04/09efdf41-dbe9-4aba-b0d6-c49a377077cc] 2025-09-27 00:02:29.235996 | orchestrator | 00:02:29.235 STDOUT terraform: openstack_compute_volume_attach_v2.node_volume_attachment[7]: Creation complete after 3s [id=a0277d4e-0848-420e-bfd9-7ae31d4ade25/f6166654-1631-4845-81e5-73fa20742766] 2025-09-27 00:02:29.288396 | orchestrator | 00:02:29.287 STDOUT terraform: openstack_compute_volume_attach_v2.node_volume_attachment[3]: Creation complete after 3s [id=43996d3c-c809-4dd4-8535-c6f0e695ee04/aa54db64-5ca4-4f56-bafa-5b00a4002696] 2025-09-27 00:02:35.367304 | orchestrator | 00:02:35.366 STDOUT terraform: openstack_compute_volume_attach_v2.node_volume_attachment[1]: Creation complete after 9s [id=a0277d4e-0848-420e-bfd9-7ae31d4ade25/e258aa1c-ff59-4b5b-956f-d2cfc00f460b] 2025-09-27 00:02:35.378595 | orchestrator | 00:02:35.378 STDOUT terraform: openstack_compute_volume_attach_v2.node_volume_attachment[0]: Creation complete after 9s [id=43996d3c-c809-4dd4-8535-c6f0e695ee04/db689dff-d74e-43e3-a305-79ec0de29e1e] 2025-09-27 00:02:35.402405 | orchestrator | 00:02:35.401 STDOUT terraform: openstack_compute_volume_attach_v2.node_volume_attachment[8]: Creation complete after 9s [id=65591551-f935-47c6-9f23-d32d473dca71/44ee43e4-0ad4-479b-91ef-60ee60e7859d] 2025-09-27 00:02:35.827297 | orchestrator | 00:02:35.826 STDOUT terraform: openstack_compute_instance_v2.manager_server: Still creating... [10s elapsed] 2025-09-27 00:02:45.827603 | orchestrator | 00:02:45.827 STDOUT terraform: openstack_compute_instance_v2.manager_server: Still creating... [20s elapsed] 2025-09-27 00:02:46.136943 | orchestrator | 00:02:46.136 STDOUT terraform: openstack_compute_instance_v2.manager_server: Creation complete after 20s [id=98d3041b-61ea-482d-84e5-5afc1ce8aac3] 2025-09-27 00:02:46.156957 | orchestrator | 00:02:46.156 STDOUT terraform: Apply complete! Resources: 64 added, 0 changed, 0 destroyed. 
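With the apply finished, the two sensitive outputs that follow are the manager address and a private key. A sketch of how the floating IP and those outputs typically fit together; the pool name and the tls_private_key reference are assumptions, while the resource and output names are taken from the log:

    resource "openstack_networking_floatingip_v2" "manager_floating_ip" {
      pool = "external"                                                  # assumed pool name
    }

    resource "openstack_networking_floatingip_associate_v2" "manager_floating_ip_association" {
      floating_ip = openstack_networking_floatingip_v2.manager_floating_ip.address
      port_id     = openstack_networking_port_v2.manager_port_management.id
    }

    output "manager_address" {
      value     = openstack_networking_floatingip_v2.manager_floating_ip.address
      sensitive = true
    }

    output "private_key" {
      value     = tls_private_key.ssh.private_key_pem                    # hypothetical key resource
      sensitive = true
    }

Because both outputs are marked sensitive, their values are not echoed in the Outputs block below; the playbook fetches the manager address in a later task instead.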
2025-09-27 00:02:46.157043 | orchestrator | 00:02:46.156 STDOUT terraform: Outputs: 2025-09-27 00:02:46.157060 | orchestrator | 00:02:46.156 STDOUT terraform: manager_address = 2025-09-27 00:02:46.157084 | orchestrator | 00:02:46.156 STDOUT terraform: private_key = 2025-09-27 00:02:46.404625 | orchestrator | ok: Runtime: 0:01:10.801117 2025-09-27 00:02:46.449726 | 2025-09-27 00:02:46.450058 | TASK [Create infrastructure (stable)] 2025-09-27 00:02:47.007130 | orchestrator | skipping: Conditional result was False 2025-09-27 00:02:47.023356 | 2025-09-27 00:02:47.023510 | TASK [Fetch manager address] 2025-09-27 00:02:47.448541 | orchestrator | ok 2025-09-27 00:02:47.458524 | 2025-09-27 00:02:47.458680 | TASK [Set manager_host address] 2025-09-27 00:02:47.549342 | orchestrator | ok 2025-09-27 00:02:47.559956 | 2025-09-27 00:02:47.560140 | LOOP [Update ansible collections] 2025-09-27 00:02:48.397147 | orchestrator | [WARNING]: Collection osism.services does not support Ansible version 2.15.2 2025-09-27 00:02:48.397476 | orchestrator | [WARNING]: Collection osism.commons does not support Ansible version 2.15.2 2025-09-27 00:02:48.397525 | orchestrator | Starting galaxy collection install process 2025-09-27 00:02:48.397557 | orchestrator | Process install dependency map 2025-09-27 00:02:48.397584 | orchestrator | Starting collection install process 2025-09-27 00:02:48.397610 | orchestrator | Installing 'osism.commons:999.0.0' to '/home/zuul-testbed03/.ansible/collections/ansible_collections/osism/commons' 2025-09-27 00:02:48.397640 | orchestrator | Created collection for osism.commons:999.0.0 at /home/zuul-testbed03/.ansible/collections/ansible_collections/osism/commons 2025-09-27 00:02:48.397672 | orchestrator | osism.commons:999.0.0 was installed successfully 2025-09-27 00:02:48.397739 | orchestrator | ok: Item: commons Runtime: 0:00:00.493501 2025-09-27 00:02:49.271718 | orchestrator | [WARNING]: Collection osism.commons does not support Ansible version 2.15.2 2025-09-27 00:02:49.271870 | orchestrator | [WARNING]: Collection osism.services does not support Ansible version 2.15.2 2025-09-27 00:02:49.271904 | orchestrator | Starting galaxy collection install process 2025-09-27 00:02:49.271928 | orchestrator | Process install dependency map 2025-09-27 00:02:49.271950 | orchestrator | Starting collection install process 2025-09-27 00:02:49.271969 | orchestrator | Installing 'osism.services:999.0.0' to '/home/zuul-testbed03/.ansible/collections/ansible_collections/osism/services' 2025-09-27 00:02:49.271990 | orchestrator | Created collection for osism.services:999.0.0 at /home/zuul-testbed03/.ansible/collections/ansible_collections/osism/services 2025-09-27 00:02:49.272009 | orchestrator | osism.services:999.0.0 was installed successfully 2025-09-27 00:02:49.272040 | orchestrator | ok: Item: services Runtime: 0:00:00.638331 2025-09-27 00:02:49.290111 | 2025-09-27 00:02:49.290258 | TASK [Wait up to 300 seconds for port 22 to become open and contain "OpenSSH"] 2025-09-27 00:02:59.826320 | orchestrator | ok 2025-09-27 00:02:59.841589 | 2025-09-27 00:02:59.841754 | TASK [Wait a little longer for the manager so that everything is ready] 2025-09-27 00:03:59.886361 | orchestrator | ok 2025-09-27 00:03:59.896637 | 2025-09-27 00:03:59.896755 | TASK [Fetch manager ssh hostkey] 2025-09-27 00:04:01.465442 | orchestrator | Output suppressed because no_log was given 2025-09-27 00:04:01.472953 | 2025-09-27 00:04:01.473089 | TASK [Get ssh keypair from terraform environment] 2025-09-27 00:04:02.006402 | orchestrator 
| ok: Runtime: 0:00:00.008171 2025-09-27 00:04:02.018390 | 2025-09-27 00:04:02.018543 | TASK [Point out that the following task takes some time and does not give any output] 2025-09-27 00:04:02.052737 | orchestrator | ok: The task 'Run manager part 0' runs an Ansible playbook on the manager. There is no further output of this here. It takes a few minutes for this task to complete. 2025-09-27 00:04:02.063238 | 2025-09-27 00:04:02.063369 | TASK [Run manager part 0] 2025-09-27 00:04:02.866311 | orchestrator | [WARNING]: Collection osism.commons does not support Ansible version 2.15.2 2025-09-27 00:04:02.911825 | orchestrator | 2025-09-27 00:04:02.911870 | orchestrator | PLAY [Wait for cloud-init to finish] ******************************************* 2025-09-27 00:04:02.911877 | orchestrator | 2025-09-27 00:04:02.911890 | orchestrator | TASK [Check /var/lib/cloud/instance/boot-finished] ***************************** 2025-09-27 00:04:04.614094 | orchestrator | ok: [testbed-manager] 2025-09-27 00:04:04.614143 | orchestrator | 2025-09-27 00:04:04.614187 | orchestrator | PLAY [Run manager part 0] ****************************************************** 2025-09-27 00:04:04.614200 | orchestrator | 2025-09-27 00:04:04.614211 | orchestrator | TASK [Gathering Facts] ********************************************************* 2025-09-27 00:04:06.312216 | orchestrator | ok: [testbed-manager] 2025-09-27 00:04:06.312270 | orchestrator | 2025-09-27 00:04:06.312294 | orchestrator | TASK [Get home directory of ansible user] ************************************** 2025-09-27 00:04:06.940856 | orchestrator | ok: [testbed-manager] 2025-09-27 00:04:06.941004 | orchestrator | 2025-09-27 00:04:06.941023 | orchestrator | TASK [Set repo_path fact] ****************************************************** 2025-09-27 00:04:06.977664 | orchestrator | skipping: [testbed-manager] 2025-09-27 00:04:06.977749 | orchestrator | 2025-09-27 00:04:06.977773 | orchestrator | TASK [Update package cache] **************************************************** 2025-09-27 00:04:06.998933 | orchestrator | skipping: [testbed-manager] 2025-09-27 00:04:06.998979 | orchestrator | 2025-09-27 00:04:06.998989 | orchestrator | TASK [Install required packages] *********************************************** 2025-09-27 00:04:07.022525 | orchestrator | skipping: [testbed-manager] 2025-09-27 00:04:07.022573 | orchestrator | 2025-09-27 00:04:07.022583 | orchestrator | TASK [Remove some python packages] ********************************************* 2025-09-27 00:04:07.046447 | orchestrator | skipping: [testbed-manager] 2025-09-27 00:04:07.046468 | orchestrator | 2025-09-27 00:04:07.046473 | orchestrator | TASK [Set venv_command fact (RedHat)] ****************************************** 2025-09-27 00:04:07.070058 | orchestrator | skipping: [testbed-manager] 2025-09-27 00:04:07.070103 | orchestrator | 2025-09-27 00:04:07.070114 | orchestrator | TASK [Fail if Ubuntu version is lower than 22.04] ****************************** 2025-09-27 00:04:07.095853 | orchestrator | skipping: [testbed-manager] 2025-09-27 00:04:07.095884 | orchestrator | 2025-09-27 00:04:07.095891 | orchestrator | TASK [Fail if Debian version is lower than 12] ********************************* 2025-09-27 00:04:07.124932 | orchestrator | skipping: [testbed-manager] 2025-09-27 00:04:07.124960 | orchestrator | 2025-09-27 00:04:07.124967 | orchestrator | TASK [Set APT options on manager] ********************************************** 2025-09-27 00:04:07.787114 | orchestrator | changed: 
[testbed-manager] 2025-09-27 00:04:07.787182 | orchestrator | 2025-09-27 00:04:07.787192 | orchestrator | TASK [Update APT cache and run dist-upgrade] *********************************** 2025-09-27 00:06:55.997770 | orchestrator | changed: [testbed-manager] 2025-09-27 00:06:55.997839 | orchestrator | 2025-09-27 00:06:55.997856 | orchestrator | TASK [Install HWE kernel package on Ubuntu] ************************************ 2025-09-27 00:08:20.920466 | orchestrator | changed: [testbed-manager] 2025-09-27 00:08:20.920582 | orchestrator | 2025-09-27 00:08:20.920610 | orchestrator | TASK [Install required packages] *********************************************** 2025-09-27 00:08:45.324260 | orchestrator | changed: [testbed-manager] 2025-09-27 00:08:45.324358 | orchestrator | 2025-09-27 00:08:45.324379 | orchestrator | TASK [Remove some python packages] ********************************************* 2025-09-27 00:08:54.522165 | orchestrator | changed: [testbed-manager] 2025-09-27 00:08:54.522281 | orchestrator | 2025-09-27 00:08:54.522296 | orchestrator | TASK [Set venv_command fact (Debian)] ****************************************** 2025-09-27 00:08:54.571243 | orchestrator | ok: [testbed-manager] 2025-09-27 00:08:54.571311 | orchestrator | 2025-09-27 00:08:54.571326 | orchestrator | TASK [Get current user] ******************************************************** 2025-09-27 00:08:57.041895 | orchestrator | ok: [testbed-manager] 2025-09-27 00:08:57.041991 | orchestrator | 2025-09-27 00:08:57.042009 | orchestrator | TASK [Create venv directory] *************************************************** 2025-09-27 00:08:57.770538 | orchestrator | changed: [testbed-manager] 2025-09-27 00:08:57.770629 | orchestrator | 2025-09-27 00:08:57.770645 | orchestrator | TASK [Install netaddr in venv] ************************************************* 2025-09-27 00:09:04.004471 | orchestrator | changed: [testbed-manager] 2025-09-27 00:09:04.004572 | orchestrator | 2025-09-27 00:09:04.004615 | orchestrator | TASK [Install ansible-core in venv] ******************************************** 2025-09-27 00:09:09.988536 | orchestrator | changed: [testbed-manager] 2025-09-27 00:09:09.988635 | orchestrator | 2025-09-27 00:09:09.988653 | orchestrator | TASK [Install requests >= 2.32.2] ********************************************** 2025-09-27 00:09:12.654798 | orchestrator | changed: [testbed-manager] 2025-09-27 00:09:12.655362 | orchestrator | 2025-09-27 00:09:12.655548 | orchestrator | TASK [Install docker >= 7.1.0] ************************************************* 2025-09-27 00:09:14.487260 | orchestrator | changed: [testbed-manager] 2025-09-27 00:09:14.487320 | orchestrator | 2025-09-27 00:09:14.487329 | orchestrator | TASK [Create directories in /opt/src] ****************************************** 2025-09-27 00:09:15.618916 | orchestrator | changed: [testbed-manager] => (item=osism/ansible-collection-commons) 2025-09-27 00:09:15.618989 | orchestrator | changed: [testbed-manager] => (item=osism/ansible-collection-services) 2025-09-27 00:09:15.619003 | orchestrator | 2025-09-27 00:09:15.619015 | orchestrator | TASK [Sync sources in /opt/src] ************************************************ 2025-09-27 00:09:15.661317 | orchestrator | [DEPRECATION WARNING]: The connection's stdin object is deprecated. Call 2025-09-27 00:09:15.661359 | orchestrator | display.prompt_until(msg) instead. This feature will be removed in version 2025-09-27 00:09:15.661365 | orchestrator | 2.19. 
Deprecation warnings can be disabled by setting 2025-09-27 00:09:15.661369 | orchestrator | deprecation_warnings=False in ansible.cfg. 2025-09-27 00:09:18.693216 | orchestrator | changed: [testbed-manager] => (item=osism/ansible-collection-commons) 2025-09-27 00:09:18.693324 | orchestrator | changed: [testbed-manager] => (item=osism/ansible-collection-services) 2025-09-27 00:09:18.693339 | orchestrator | 2025-09-27 00:09:18.693352 | orchestrator | TASK [Create /usr/share/ansible directory] ************************************* 2025-09-27 00:09:19.256983 | orchestrator | changed: [testbed-manager] 2025-09-27 00:09:19.257028 | orchestrator | 2025-09-27 00:09:19.257036 | orchestrator | TASK [Install collections from Ansible galaxy] ********************************* 2025-09-27 00:11:38.912647 | orchestrator | changed: [testbed-manager] => (item=ansible.netcommon) 2025-09-27 00:11:38.912757 | orchestrator | changed: [testbed-manager] => (item=ansible.posix) 2025-09-27 00:11:38.912775 | orchestrator | changed: [testbed-manager] => (item=community.docker>=3.10.2) 2025-09-27 00:11:38.912787 | orchestrator | 2025-09-27 00:11:38.912799 | orchestrator | TASK [Install local collections] *********************************************** 2025-09-27 00:11:41.242820 | orchestrator | changed: [testbed-manager] => (item=ansible-collection-commons) 2025-09-27 00:11:41.242910 | orchestrator | changed: [testbed-manager] => (item=ansible-collection-services) 2025-09-27 00:11:41.242925 | orchestrator | 2025-09-27 00:11:41.242938 | orchestrator | PLAY [Create operator user] **************************************************** 2025-09-27 00:11:41.242950 | orchestrator | 2025-09-27 00:11:41.242962 | orchestrator | TASK [Gathering Facts] ********************************************************* 2025-09-27 00:11:42.620623 | orchestrator | ok: [testbed-manager] 2025-09-27 00:11:42.620717 | orchestrator | 2025-09-27 00:11:42.620735 | orchestrator | TASK [osism.commons.operator : Gather variables for each operating system] ***** 2025-09-27 00:11:42.667356 | orchestrator | ok: [testbed-manager] 2025-09-27 00:11:42.667397 | orchestrator | 2025-09-27 00:11:42.667406 | orchestrator | TASK [osism.commons.operator : Set operator_groups variable to default value] *** 2025-09-27 00:11:42.731204 | orchestrator | ok: [testbed-manager] 2025-09-27 00:11:42.731301 | orchestrator | 2025-09-27 00:11:42.731317 | orchestrator | TASK [osism.commons.operator : Create operator group] ************************** 2025-09-27 00:11:43.531039 | orchestrator | changed: [testbed-manager] 2025-09-27 00:11:43.531124 | orchestrator | 2025-09-27 00:11:43.531139 | orchestrator | TASK [osism.commons.operator : Create user] ************************************ 2025-09-27 00:11:44.231693 | orchestrator | changed: [testbed-manager] 2025-09-27 00:11:44.231778 | orchestrator | 2025-09-27 00:11:44.231795 | orchestrator | TASK [osism.commons.operator : Add user to additional groups] ****************** 2025-09-27 00:11:45.606189 | orchestrator | changed: [testbed-manager] => (item=adm) 2025-09-27 00:11:45.606300 | orchestrator | changed: [testbed-manager] => (item=sudo) 2025-09-27 00:11:45.606316 | orchestrator | 2025-09-27 00:11:45.606342 | orchestrator | TASK [osism.commons.operator : Copy user sudoers file] ************************* 2025-09-27 00:11:46.977931 | orchestrator | changed: [testbed-manager] 2025-09-27 00:11:46.978067 | orchestrator | 2025-09-27 00:11:46.978087 | orchestrator | TASK [osism.commons.operator : Set language variables in .bashrc 
configuration file] *** 2025-09-27 00:11:48.657318 | orchestrator | changed: [testbed-manager] => (item=export LANGUAGE=C.UTF-8) 2025-09-27 00:11:48.657408 | orchestrator | changed: [testbed-manager] => (item=export LANG=C.UTF-8) 2025-09-27 00:11:48.657424 | orchestrator | changed: [testbed-manager] => (item=export LC_ALL=C.UTF-8) 2025-09-27 00:11:48.657435 | orchestrator | 2025-09-27 00:11:48.657449 | orchestrator | TASK [osism.commons.operator : Set custom environment variables in .bashrc configuration file] *** 2025-09-27 00:11:48.712581 | orchestrator | skipping: [testbed-manager] 2025-09-27 00:11:48.712651 | orchestrator | 2025-09-27 00:11:48.712666 | orchestrator | TASK [osism.commons.operator : Create .ssh directory] ************************** 2025-09-27 00:11:49.246992 | orchestrator | changed: [testbed-manager] 2025-09-27 00:11:49.247032 | orchestrator | 2025-09-27 00:11:49.247041 | orchestrator | TASK [osism.commons.operator : Check number of SSH authorized keys] ************ 2025-09-27 00:11:49.312005 | orchestrator | skipping: [testbed-manager] 2025-09-27 00:11:49.312045 | orchestrator | 2025-09-27 00:11:49.312054 | orchestrator | TASK [osism.commons.operator : Set ssh authorized keys] ************************ 2025-09-27 00:11:50.136969 | orchestrator | changed: [testbed-manager] => (item=None) 2025-09-27 00:11:50.137011 | orchestrator | changed: [testbed-manager] 2025-09-27 00:11:50.137020 | orchestrator | 2025-09-27 00:11:50.137027 | orchestrator | TASK [osism.commons.operator : Delete ssh authorized keys] ********************* 2025-09-27 00:11:50.177849 | orchestrator | skipping: [testbed-manager] 2025-09-27 00:11:50.177884 | orchestrator | 2025-09-27 00:11:50.177891 | orchestrator | TASK [osism.commons.operator : Set authorized GitHub accounts] ***************** 2025-09-27 00:11:50.218280 | orchestrator | skipping: [testbed-manager] 2025-09-27 00:11:50.218344 | orchestrator | 2025-09-27 00:11:50.218357 | orchestrator | TASK [osism.commons.operator : Delete authorized GitHub accounts] ************** 2025-09-27 00:11:50.255056 | orchestrator | skipping: [testbed-manager] 2025-09-27 00:11:50.255111 | orchestrator | 2025-09-27 00:11:50.255124 | orchestrator | TASK [osism.commons.operator : Set password] *********************************** 2025-09-27 00:11:50.323480 | orchestrator | skipping: [testbed-manager] 2025-09-27 00:11:50.323562 | orchestrator | 2025-09-27 00:11:50.323582 | orchestrator | TASK [osism.commons.operator : Unset & lock password] ************************** 2025-09-27 00:11:51.041320 | orchestrator | ok: [testbed-manager] 2025-09-27 00:11:51.042149 | orchestrator | 2025-09-27 00:11:51.042173 | orchestrator | PLAY [Run manager part 0] ****************************************************** 2025-09-27 00:11:51.042185 | orchestrator | 2025-09-27 00:11:51.042196 | orchestrator | TASK [Gathering Facts] ********************************************************* 2025-09-27 00:11:52.480977 | orchestrator | ok: [testbed-manager] 2025-09-27 00:11:52.481063 | orchestrator | 2025-09-27 00:11:52.481078 | orchestrator | TASK [Recursively change ownership of /opt/venv] ******************************* 2025-09-27 00:11:53.431131 | orchestrator | changed: [testbed-manager] 2025-09-27 00:11:53.431166 | orchestrator | 2025-09-27 00:11:53.431172 | orchestrator | PLAY RECAP ********************************************************************* 2025-09-27 00:11:53.431177 | orchestrator | testbed-manager : ok=33 changed=23 unreachable=0 failed=0 skipped=13 rescued=0 ignored=0 2025-09-27 
00:11:53.431182 | orchestrator | 2025-09-27 00:11:53.877689 | orchestrator | ok: Runtime: 0:07:51.175083 2025-09-27 00:11:53.895896 | 2025-09-27 00:11:53.896036 | TASK [Point out that the log in on the manager is now possible] 2025-09-27 00:11:53.942727 | orchestrator | ok: It is now already possible to log in to the manager with 'make login'. 2025-09-27 00:11:53.952004 | 2025-09-27 00:11:53.952123 | TASK [Point out that the following task takes some time and does not give any output] 2025-09-27 00:11:53.984257 | orchestrator | ok: The task 'Run manager part 1 + 2' runs an Ansible playbook on the manager. There is no further output of this here. It takes a few minuts for this task to complete. 2025-09-27 00:11:53.992016 | 2025-09-27 00:11:53.992124 | TASK [Run manager part 1 + 2] 2025-09-27 00:11:54.899719 | orchestrator | [WARNING]: Collection osism.commons does not support Ansible version 2.15.2 2025-09-27 00:11:54.955277 | orchestrator | 2025-09-27 00:11:54.955343 | orchestrator | PLAY [Run manager part 1] ****************************************************** 2025-09-27 00:11:54.955359 | orchestrator | 2025-09-27 00:11:54.955386 | orchestrator | TASK [Gathering Facts] ********************************************************* 2025-09-27 00:11:57.946430 | orchestrator | ok: [testbed-manager] 2025-09-27 00:11:57.946562 | orchestrator | 2025-09-27 00:11:57.946621 | orchestrator | TASK [Set venv_command fact (RedHat)] ****************************************** 2025-09-27 00:11:57.983290 | orchestrator | skipping: [testbed-manager] 2025-09-27 00:11:57.983331 | orchestrator | 2025-09-27 00:11:57.983341 | orchestrator | TASK [Set venv_command fact (Debian)] ****************************************** 2025-09-27 00:11:58.019428 | orchestrator | ok: [testbed-manager] 2025-09-27 00:11:58.019462 | orchestrator | 2025-09-27 00:11:58.019472 | orchestrator | TASK [osism.commons.repository : Gather variables for each operating system] *** 2025-09-27 00:11:58.057705 | orchestrator | ok: [testbed-manager] 2025-09-27 00:11:58.057736 | orchestrator | 2025-09-27 00:11:58.057743 | orchestrator | TASK [osism.commons.repository : Set repository_default fact to default value] *** 2025-09-27 00:11:58.119555 | orchestrator | ok: [testbed-manager] 2025-09-27 00:11:58.119586 | orchestrator | 2025-09-27 00:11:58.119592 | orchestrator | TASK [osism.commons.repository : Set repositories to default] ****************** 2025-09-27 00:11:58.173135 | orchestrator | ok: [testbed-manager] 2025-09-27 00:11:58.173167 | orchestrator | 2025-09-27 00:11:58.173173 | orchestrator | TASK [osism.commons.repository : Include distribution specific repository tasks] *** 2025-09-27 00:11:58.214151 | orchestrator | included: /home/zuul-testbed03/.ansible/collections/ansible_collections/osism/commons/roles/repository/tasks/Ubuntu.yml for testbed-manager 2025-09-27 00:11:58.214177 | orchestrator | 2025-09-27 00:11:58.214182 | orchestrator | TASK [osism.commons.repository : Create /etc/apt/sources.list.d directory] ***** 2025-09-27 00:11:58.886603 | orchestrator | ok: [testbed-manager] 2025-09-27 00:11:58.886650 | orchestrator | 2025-09-27 00:11:58.886659 | orchestrator | TASK [osism.commons.repository : Include tasks for Ubuntu < 24.04] ************* 2025-09-27 00:11:58.928386 | orchestrator | skipping: [testbed-manager] 2025-09-27 00:11:58.928431 | orchestrator | 2025-09-27 00:11:58.928440 | orchestrator | TASK [osism.commons.repository : Copy 99osism apt configuration] *************** 2025-09-27 00:12:00.218686 | orchestrator | changed: 
[testbed-manager] 2025-09-27 00:12:00.218726 | orchestrator | 2025-09-27 00:12:00.218736 | orchestrator | TASK [osism.commons.repository : Remove sources.list file] ********************* 2025-09-27 00:12:00.778052 | orchestrator | ok: [testbed-manager] 2025-09-27 00:12:00.778125 | orchestrator | 2025-09-27 00:12:00.778140 | orchestrator | TASK [osism.commons.repository : Copy ubuntu.sources file] ********************* 2025-09-27 00:12:01.883609 | orchestrator | changed: [testbed-manager] 2025-09-27 00:12:01.883683 | orchestrator | 2025-09-27 00:12:01.883701 | orchestrator | TASK [osism.commons.repository : Update package cache] ************************* 2025-09-27 00:12:17.805456 | orchestrator | changed: [testbed-manager] 2025-09-27 00:12:17.805503 | orchestrator | 2025-09-27 00:12:17.805510 | orchestrator | TASK [Get home directory of ansible user] ************************************** 2025-09-27 00:12:18.438057 | orchestrator | ok: [testbed-manager] 2025-09-27 00:12:18.438087 | orchestrator | 2025-09-27 00:12:18.438094 | orchestrator | TASK [Set repo_path fact] ****************************************************** 2025-09-27 00:12:18.490730 | orchestrator | skipping: [testbed-manager] 2025-09-27 00:12:18.490758 | orchestrator | 2025-09-27 00:12:18.490763 | orchestrator | TASK [Copy SSH public key] ***************************************************** 2025-09-27 00:12:19.407948 | orchestrator | changed: [testbed-manager] 2025-09-27 00:12:19.408005 | orchestrator | 2025-09-27 00:12:19.408018 | orchestrator | TASK [Copy SSH private key] **************************************************** 2025-09-27 00:12:20.332584 | orchestrator | changed: [testbed-manager] 2025-09-27 00:12:20.332620 | orchestrator | 2025-09-27 00:12:20.332627 | orchestrator | TASK [Create configuration directory] ****************************************** 2025-09-27 00:12:20.884107 | orchestrator | changed: [testbed-manager] 2025-09-27 00:12:20.884140 | orchestrator | 2025-09-27 00:12:20.884146 | orchestrator | TASK [Copy testbed repo] ******************************************************* 2025-09-27 00:12:20.919997 | orchestrator | [DEPRECATION WARNING]: The connection's stdin object is deprecated. Call 2025-09-27 00:12:20.920074 | orchestrator | display.prompt_until(msg) instead. This feature will be removed in version 2025-09-27 00:12:20.920087 | orchestrator | 2.19. Deprecation warnings can be disabled by setting 2025-09-27 00:12:20.920098 | orchestrator | deprecation_warnings=False in ansible.cfg. 
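The deprecation warning repeated above comes from the nested Ansible run and is informational only; as the message itself states, it can be silenced through ansible.cfg. A minimal sketch, assuming a fresh ./ansible.cfg next to the playbook (any file on Ansible's configuration search path would work; if one already exists, the key belongs in its existing [defaults] section instead):

# Sketch: opt out of Ansible deprecation warnings for this project.
cat > ansible.cfg <<'EOF'
[defaults]
deprecation_warnings = False
EOF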
2025-09-27 00:12:22.640607 | orchestrator | changed: [testbed-manager] 2025-09-27 00:12:22.640694 | orchestrator | 2025-09-27 00:12:22.640710 | orchestrator | TASK [Install python requirements in venv] ************************************* 2025-09-27 00:12:31.106138 | orchestrator | ok: [testbed-manager] => (item=Jinja2) 2025-09-27 00:12:31.106195 | orchestrator | ok: [testbed-manager] => (item=PyYAML) 2025-09-27 00:12:31.106206 | orchestrator | ok: [testbed-manager] => (item=packaging) 2025-09-27 00:12:31.106213 | orchestrator | changed: [testbed-manager] => (item=python-gilt==1.2.3) 2025-09-27 00:12:31.106225 | orchestrator | ok: [testbed-manager] => (item=requests>=2.32.2) 2025-09-27 00:12:31.106232 | orchestrator | ok: [testbed-manager] => (item=docker>=7.1.0) 2025-09-27 00:12:31.106239 | orchestrator | 2025-09-27 00:12:31.106296 | orchestrator | TASK [Copy testbed custom CA certificate on Debian/Ubuntu] ********************* 2025-09-27 00:12:32.032488 | orchestrator | changed: [testbed-manager] 2025-09-27 00:12:32.032589 | orchestrator | 2025-09-27 00:12:32.032606 | orchestrator | TASK [Copy testbed custom CA certificate on CentOS] **************************** 2025-09-27 00:12:32.079473 | orchestrator | skipping: [testbed-manager] 2025-09-27 00:12:32.079542 | orchestrator | 2025-09-27 00:12:32.079551 | orchestrator | TASK [Run update-ca-certificates on Debian/Ubuntu] ***************************** 2025-09-27 00:12:34.849919 | orchestrator | changed: [testbed-manager] 2025-09-27 00:12:34.850077 | orchestrator | 2025-09-27 00:12:34.850098 | orchestrator | TASK [Run update-ca-trust on RedHat] ******************************************* 2025-09-27 00:12:34.894130 | orchestrator | skipping: [testbed-manager] 2025-09-27 00:12:34.894195 | orchestrator | 2025-09-27 00:12:34.894214 | orchestrator | TASK [Run manager part 2] ****************************************************** 2025-09-27 00:14:07.948643 | orchestrator | changed: [testbed-manager] 2025-09-27 00:14:07.948761 | orchestrator | 2025-09-27 00:14:07.948780 | orchestrator | RUNNING HANDLER [osism.commons.repository : Force update of package cache] ***** 2025-09-27 00:14:09.105213 | orchestrator | ok: [testbed-manager] 2025-09-27 00:14:09.105276 | orchestrator | 2025-09-27 00:14:09.105283 | orchestrator | PLAY RECAP ********************************************************************* 2025-09-27 00:14:09.105290 | orchestrator | testbed-manager : ok=21 changed=11 unreachable=0 failed=0 skipped=5 rescued=0 ignored=0 2025-09-27 00:14:09.105295 | orchestrator | 2025-09-27 00:14:09.633688 | orchestrator | ok: Runtime: 0:02:14.868816 2025-09-27 00:14:09.651089 | 2025-09-27 00:14:09.651234 | TASK [Reboot manager] 2025-09-27 00:14:11.186940 | orchestrator | ok: Runtime: 0:00:00.935136 2025-09-27 00:14:11.203319 | 2025-09-27 00:14:11.203492 | TASK [Wait up to 300 seconds for port 22 to become open and contain "OpenSSH"] 2025-09-27 00:14:26.356815 | orchestrator | ok 2025-09-27 00:14:26.368125 | 2025-09-27 00:14:26.368254 | TASK [Wait a little longer for the manager so that everything is ready] 2025-09-27 00:15:26.420927 | orchestrator | ok 2025-09-27 00:15:26.431475 | 2025-09-27 00:15:26.431598 | TASK [Deploy manager + bootstrap nodes] 2025-09-27 00:15:28.994170 | orchestrator | 2025-09-27 00:15:28.994381 | orchestrator | # DEPLOY MANAGER 2025-09-27 00:15:28.994407 | orchestrator | 2025-09-27 00:15:28.994422 | orchestrator | + set -e 2025-09-27 00:15:28.994435 | orchestrator | + echo 2025-09-27 00:15:28.994450 | orchestrator | + echo '# DEPLOY 
MANAGER' 2025-09-27 00:15:28.994467 | orchestrator | + echo 2025-09-27 00:15:28.994510 | orchestrator | + cat /opt/manager-vars.sh 2025-09-27 00:15:28.998078 | orchestrator | export NUMBER_OF_NODES=6 2025-09-27 00:15:28.998102 | orchestrator | 2025-09-27 00:15:28.998115 | orchestrator | export CEPH_VERSION=reef 2025-09-27 00:15:28.998128 | orchestrator | export CONFIGURATION_VERSION=main 2025-09-27 00:15:28.998140 | orchestrator | export MANAGER_VERSION=latest 2025-09-27 00:15:28.998161 | orchestrator | export OPENSTACK_VERSION=2024.2 2025-09-27 00:15:28.998172 | orchestrator | 2025-09-27 00:15:28.998190 | orchestrator | export ARA=false 2025-09-27 00:15:28.998202 | orchestrator | export DEPLOY_MODE=manager 2025-09-27 00:15:28.998219 | orchestrator | export TEMPEST=true 2025-09-27 00:15:28.998230 | orchestrator | export IS_ZUUL=true 2025-09-27 00:15:28.998242 | orchestrator | 2025-09-27 00:15:28.998259 | orchestrator | export MANAGER_PUBLIC_IP_ADDRESS=81.163.193.20 2025-09-27 00:15:28.998294 | orchestrator | export EXTERNAL_API=false 2025-09-27 00:15:28.998305 | orchestrator | 2025-09-27 00:15:28.998316 | orchestrator | export IMAGE_USER=ubuntu 2025-09-27 00:15:28.998331 | orchestrator | export IMAGE_NODE_USER=ubuntu 2025-09-27 00:15:28.998342 | orchestrator | 2025-09-27 00:15:28.998352 | orchestrator | export CEPH_STACK=ceph-ansible 2025-09-27 00:15:28.998592 | orchestrator | 2025-09-27 00:15:28.998609 | orchestrator | + echo 2025-09-27 00:15:28.998625 | orchestrator | + source /opt/configuration/scripts/include.sh 2025-09-27 00:15:29.000375 | orchestrator | ++ export INTERACTIVE=false 2025-09-27 00:15:29.000394 | orchestrator | ++ INTERACTIVE=false 2025-09-27 00:15:29.000408 | orchestrator | ++ export OSISM_APPLY_RETRY=1 2025-09-27 00:15:29.000420 | orchestrator | ++ OSISM_APPLY_RETRY=1 2025-09-27 00:15:29.000492 | orchestrator | + source /opt/manager-vars.sh 2025-09-27 00:15:29.000506 | orchestrator | ++ export NUMBER_OF_NODES=6 2025-09-27 00:15:29.000518 | orchestrator | ++ NUMBER_OF_NODES=6 2025-09-27 00:15:29.000528 | orchestrator | ++ export CEPH_VERSION=reef 2025-09-27 00:15:29.000539 | orchestrator | ++ CEPH_VERSION=reef 2025-09-27 00:15:29.000555 | orchestrator | ++ export CONFIGURATION_VERSION=main 2025-09-27 00:15:29.000566 | orchestrator | ++ CONFIGURATION_VERSION=main 2025-09-27 00:15:29.000577 | orchestrator | ++ export MANAGER_VERSION=latest 2025-09-27 00:15:29.000588 | orchestrator | ++ MANAGER_VERSION=latest 2025-09-27 00:15:29.000599 | orchestrator | ++ export OPENSTACK_VERSION=2024.2 2025-09-27 00:15:29.000617 | orchestrator | ++ OPENSTACK_VERSION=2024.2 2025-09-27 00:15:29.000628 | orchestrator | ++ export ARA=false 2025-09-27 00:15:29.000639 | orchestrator | ++ ARA=false 2025-09-27 00:15:29.000650 | orchestrator | ++ export DEPLOY_MODE=manager 2025-09-27 00:15:29.000661 | orchestrator | ++ DEPLOY_MODE=manager 2025-09-27 00:15:29.000671 | orchestrator | ++ export TEMPEST=true 2025-09-27 00:15:29.000682 | orchestrator | ++ TEMPEST=true 2025-09-27 00:15:29.000696 | orchestrator | ++ export IS_ZUUL=true 2025-09-27 00:15:29.000707 | orchestrator | ++ IS_ZUUL=true 2025-09-27 00:15:29.000718 | orchestrator | ++ export MANAGER_PUBLIC_IP_ADDRESS=81.163.193.20 2025-09-27 00:15:29.000729 | orchestrator | ++ MANAGER_PUBLIC_IP_ADDRESS=81.163.193.20 2025-09-27 00:15:29.000740 | orchestrator | ++ export EXTERNAL_API=false 2025-09-27 00:15:29.000751 | orchestrator | ++ EXTERNAL_API=false 2025-09-27 00:15:29.000762 | orchestrator | ++ export IMAGE_USER=ubuntu 2025-09-27 
00:15:29.000772 | orchestrator | ++ IMAGE_USER=ubuntu 2025-09-27 00:15:29.000867 | orchestrator | ++ export IMAGE_NODE_USER=ubuntu 2025-09-27 00:15:29.000882 | orchestrator | ++ IMAGE_NODE_USER=ubuntu 2025-09-27 00:15:29.000893 | orchestrator | ++ export CEPH_STACK=ceph-ansible 2025-09-27 00:15:29.000905 | orchestrator | ++ CEPH_STACK=ceph-ansible 2025-09-27 00:15:29.000916 | orchestrator | + sudo ln -sf /opt/configuration/contrib/semver2.sh /usr/local/bin/semver 2025-09-27 00:15:29.058429 | orchestrator | + docker version 2025-09-27 00:15:29.329109 | orchestrator | Client: Docker Engine - Community 2025-09-27 00:15:29.329200 | orchestrator | Version: 27.5.1 2025-09-27 00:15:29.329213 | orchestrator | API version: 1.47 2025-09-27 00:15:29.329225 | orchestrator | Go version: go1.22.11 2025-09-27 00:15:29.329235 | orchestrator | Git commit: 9f9e405 2025-09-27 00:15:29.329245 | orchestrator | Built: Wed Jan 22 13:41:48 2025 2025-09-27 00:15:29.329256 | orchestrator | OS/Arch: linux/amd64 2025-09-27 00:15:29.329309 | orchestrator | Context: default 2025-09-27 00:15:29.329319 | orchestrator | 2025-09-27 00:15:29.329329 | orchestrator | Server: Docker Engine - Community 2025-09-27 00:15:29.329339 | orchestrator | Engine: 2025-09-27 00:15:29.329349 | orchestrator | Version: 27.5.1 2025-09-27 00:15:29.329359 | orchestrator | API version: 1.47 (minimum version 1.24) 2025-09-27 00:15:29.329397 | orchestrator | Go version: go1.22.11 2025-09-27 00:15:29.329407 | orchestrator | Git commit: 4c9b3b0 2025-09-27 00:15:29.329416 | orchestrator | Built: Wed Jan 22 13:41:48 2025 2025-09-27 00:15:29.329426 | orchestrator | OS/Arch: linux/amd64 2025-09-27 00:15:29.329436 | orchestrator | Experimental: false 2025-09-27 00:15:29.329445 | orchestrator | containerd: 2025-09-27 00:15:29.329455 | orchestrator | Version: v1.7.28 2025-09-27 00:15:29.329465 | orchestrator | GitCommit: b98a3aace656320842a23f4a392a33f46af97866 2025-09-27 00:15:29.329475 | orchestrator | runc: 2025-09-27 00:15:29.329484 | orchestrator | Version: 1.3.0 2025-09-27 00:15:29.329494 | orchestrator | GitCommit: v1.3.0-0-g4ca628d1 2025-09-27 00:15:29.329504 | orchestrator | docker-init: 2025-09-27 00:15:29.329513 | orchestrator | Version: 0.19.0 2025-09-27 00:15:29.329524 | orchestrator | GitCommit: de40ad0 2025-09-27 00:15:29.333693 | orchestrator | + sh -c /opt/configuration/scripts/deploy/000-manager.sh 2025-09-27 00:15:29.344050 | orchestrator | + set -e 2025-09-27 00:15:29.344066 | orchestrator | + source /opt/manager-vars.sh 2025-09-27 00:15:29.344078 | orchestrator | ++ export NUMBER_OF_NODES=6 2025-09-27 00:15:29.344088 | orchestrator | ++ NUMBER_OF_NODES=6 2025-09-27 00:15:29.344097 | orchestrator | ++ export CEPH_VERSION=reef 2025-09-27 00:15:29.344107 | orchestrator | ++ CEPH_VERSION=reef 2025-09-27 00:15:29.344117 | orchestrator | ++ export CONFIGURATION_VERSION=main 2025-09-27 00:15:29.344127 | orchestrator | ++ CONFIGURATION_VERSION=main 2025-09-27 00:15:29.344136 | orchestrator | ++ export MANAGER_VERSION=latest 2025-09-27 00:15:29.344145 | orchestrator | ++ MANAGER_VERSION=latest 2025-09-27 00:15:29.344155 | orchestrator | ++ export OPENSTACK_VERSION=2024.2 2025-09-27 00:15:29.344164 | orchestrator | ++ OPENSTACK_VERSION=2024.2 2025-09-27 00:15:29.344174 | orchestrator | ++ export ARA=false 2025-09-27 00:15:29.344183 | orchestrator | ++ ARA=false 2025-09-27 00:15:29.344192 | orchestrator | ++ export DEPLOY_MODE=manager 2025-09-27 00:15:29.344202 | orchestrator | ++ DEPLOY_MODE=manager 2025-09-27 00:15:29.344212 | orchestrator | ++ 
export TEMPEST=true 2025-09-27 00:15:29.344221 | orchestrator | ++ TEMPEST=true 2025-09-27 00:15:29.344230 | orchestrator | ++ export IS_ZUUL=true 2025-09-27 00:15:29.344240 | orchestrator | ++ IS_ZUUL=true 2025-09-27 00:15:29.344249 | orchestrator | ++ export MANAGER_PUBLIC_IP_ADDRESS=81.163.193.20 2025-09-27 00:15:29.344259 | orchestrator | ++ MANAGER_PUBLIC_IP_ADDRESS=81.163.193.20 2025-09-27 00:15:29.344290 | orchestrator | ++ export EXTERNAL_API=false 2025-09-27 00:15:29.344300 | orchestrator | ++ EXTERNAL_API=false 2025-09-27 00:15:29.344309 | orchestrator | ++ export IMAGE_USER=ubuntu 2025-09-27 00:15:29.344319 | orchestrator | ++ IMAGE_USER=ubuntu 2025-09-27 00:15:29.344328 | orchestrator | ++ export IMAGE_NODE_USER=ubuntu 2025-09-27 00:15:29.344338 | orchestrator | ++ IMAGE_NODE_USER=ubuntu 2025-09-27 00:15:29.344347 | orchestrator | ++ export CEPH_STACK=ceph-ansible 2025-09-27 00:15:29.344357 | orchestrator | ++ CEPH_STACK=ceph-ansible 2025-09-27 00:15:29.344366 | orchestrator | + source /opt/configuration/scripts/include.sh 2025-09-27 00:15:29.344376 | orchestrator | ++ export INTERACTIVE=false 2025-09-27 00:15:29.344385 | orchestrator | ++ INTERACTIVE=false 2025-09-27 00:15:29.344395 | orchestrator | ++ export OSISM_APPLY_RETRY=1 2025-09-27 00:15:29.344408 | orchestrator | ++ OSISM_APPLY_RETRY=1 2025-09-27 00:15:29.344619 | orchestrator | + [[ latest != \l\a\t\e\s\t ]] 2025-09-27 00:15:29.344632 | orchestrator | + [[ latest == \l\a\t\e\s\t ]] 2025-09-27 00:15:29.344642 | orchestrator | + /opt/configuration/scripts/set-ceph-version.sh reef 2025-09-27 00:15:29.352604 | orchestrator | + set -e 2025-09-27 00:15:29.352622 | orchestrator | + VERSION=reef 2025-09-27 00:15:29.353727 | orchestrator | ++ grep '^ceph_version:' /opt/configuration/environments/manager/configuration.yml 2025-09-27 00:15:29.359791 | orchestrator | + [[ -n ceph_version: reef ]] 2025-09-27 00:15:29.359810 | orchestrator | + sed -i 's/ceph_version: .*/ceph_version: reef/g' /opt/configuration/environments/manager/configuration.yml 2025-09-27 00:15:29.366491 | orchestrator | + /opt/configuration/scripts/set-openstack-version.sh 2024.2 2025-09-27 00:15:29.373217 | orchestrator | + set -e 2025-09-27 00:15:29.373235 | orchestrator | + VERSION=2024.2 2025-09-27 00:15:29.374332 | orchestrator | ++ grep '^openstack_version:' /opt/configuration/environments/manager/configuration.yml 2025-09-27 00:15:29.377587 | orchestrator | + [[ -n openstack_version: 2024.2 ]] 2025-09-27 00:15:29.377631 | orchestrator | + sed -i 's/openstack_version: .*/openstack_version: 2024.2/g' /opt/configuration/environments/manager/configuration.yml 2025-09-27 00:15:29.384075 | orchestrator | + [[ ceph-ansible == \r\o\o\k ]] 2025-09-27 00:15:29.384866 | orchestrator | ++ semver latest 7.0.0 2025-09-27 00:15:29.453123 | orchestrator | + [[ -1 -ge 0 ]] 2025-09-27 00:15:29.453167 | orchestrator | + [[ latest == \l\a\t\e\s\t ]] 2025-09-27 00:15:29.453179 | orchestrator | + echo 'enable_osism_kubernetes: true' 2025-09-27 00:15:29.453190 | orchestrator | + /opt/configuration/scripts/enable-resource-nodes.sh 2025-09-27 00:15:29.550311 | orchestrator | + [[ -e /opt/venv/bin/activate ]] 2025-09-27 00:15:29.551720 | orchestrator | + source /opt/venv/bin/activate 2025-09-27 00:15:29.552883 | orchestrator | ++ deactivate nondestructive 2025-09-27 00:15:29.552900 | orchestrator | ++ '[' -n '' ']' 2025-09-27 00:15:29.552912 | orchestrator | ++ '[' -n '' ']' 2025-09-27 00:15:29.552922 | orchestrator | ++ hash -r 2025-09-27 00:15:29.553200 | orchestrator | ++ 
'[' -n '' ']' 2025-09-27 00:15:29.553213 | orchestrator | ++ unset VIRTUAL_ENV 2025-09-27 00:15:29.553222 | orchestrator | ++ unset VIRTUAL_ENV_PROMPT 2025-09-27 00:15:29.553232 | orchestrator | ++ '[' '!' nondestructive = nondestructive ']' 2025-09-27 00:15:29.553247 | orchestrator | ++ '[' linux-gnu = cygwin ']' 2025-09-27 00:15:29.553258 | orchestrator | ++ '[' linux-gnu = msys ']' 2025-09-27 00:15:29.553284 | orchestrator | ++ export VIRTUAL_ENV=/opt/venv 2025-09-27 00:15:29.553294 | orchestrator | ++ VIRTUAL_ENV=/opt/venv 2025-09-27 00:15:29.553304 | orchestrator | ++ _OLD_VIRTUAL_PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin 2025-09-27 00:15:29.553317 | orchestrator | ++ PATH=/opt/venv/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin 2025-09-27 00:15:29.553327 | orchestrator | ++ export PATH 2025-09-27 00:15:29.553339 | orchestrator | ++ '[' -n '' ']' 2025-09-27 00:15:29.553375 | orchestrator | ++ '[' -z '' ']' 2025-09-27 00:15:29.553386 | orchestrator | ++ _OLD_VIRTUAL_PS1= 2025-09-27 00:15:29.553454 | orchestrator | ++ PS1='(venv) ' 2025-09-27 00:15:29.553467 | orchestrator | ++ export PS1 2025-09-27 00:15:29.553477 | orchestrator | ++ VIRTUAL_ENV_PROMPT='(venv) ' 2025-09-27 00:15:29.553486 | orchestrator | ++ export VIRTUAL_ENV_PROMPT 2025-09-27 00:15:29.553499 | orchestrator | ++ hash -r 2025-09-27 00:15:29.553782 | orchestrator | + ansible-playbook -i testbed-manager, --vault-password-file /opt/configuration/environments/.vault_pass /opt/configuration/ansible/manager-part-3.yml 2025-09-27 00:15:30.832985 | orchestrator | 2025-09-27 00:15:30.833104 | orchestrator | PLAY [Copy custom facts] ******************************************************* 2025-09-27 00:15:30.833121 | orchestrator | 2025-09-27 00:15:30.833132 | orchestrator | TASK [Create custom facts directory] ******************************************* 2025-09-27 00:15:31.421779 | orchestrator | ok: [testbed-manager] 2025-09-27 00:15:31.421884 | orchestrator | 2025-09-27 00:15:31.421900 | orchestrator | TASK [Copy fact files] ********************************************************* 2025-09-27 00:15:32.381612 | orchestrator | changed: [testbed-manager] 2025-09-27 00:15:32.381714 | orchestrator | 2025-09-27 00:15:32.381729 | orchestrator | PLAY [Before the deployment of the manager] ************************************ 2025-09-27 00:15:32.381742 | orchestrator | 2025-09-27 00:15:32.381753 | orchestrator | TASK [Gathering Facts] ********************************************************* 2025-09-27 00:15:34.666432 | orchestrator | ok: [testbed-manager] 2025-09-27 00:15:34.666534 | orchestrator | 2025-09-27 00:15:34.666550 | orchestrator | TASK [Get /opt/manager-vars.sh] ************************************************ 2025-09-27 00:15:34.722414 | orchestrator | ok: [testbed-manager] 2025-09-27 00:15:34.722513 | orchestrator | 2025-09-27 00:15:34.722540 | orchestrator | TASK [Add ara_server_mariadb_volume_type parameter] **************************** 2025-09-27 00:15:35.170940 | orchestrator | changed: [testbed-manager] 2025-09-27 00:15:35.171051 | orchestrator | 2025-09-27 00:15:35.171066 | orchestrator | TASK [Add netbox_enable parameter] ********************************************* 2025-09-27 00:15:35.208943 | orchestrator | skipping: [testbed-manager] 2025-09-27 00:15:35.208972 | orchestrator | 2025-09-27 00:15:35.208984 | orchestrator | TASK [Install HWE kernel package on Ubuntu] 
************************************ 2025-09-27 00:15:35.547567 | orchestrator | changed: [testbed-manager] 2025-09-27 00:15:35.547637 | orchestrator | 2025-09-27 00:15:35.547650 | orchestrator | TASK [Use insecure glance configuration] *************************************** 2025-09-27 00:15:35.602656 | orchestrator | skipping: [testbed-manager] 2025-09-27 00:15:35.602721 | orchestrator | 2025-09-27 00:15:35.602734 | orchestrator | TASK [Check if /etc/OTC_region exist] ****************************************** 2025-09-27 00:15:35.929187 | orchestrator | ok: [testbed-manager] 2025-09-27 00:15:35.929342 | orchestrator | 2025-09-27 00:15:35.929361 | orchestrator | TASK [Add nova_compute_virt_type parameter] ************************************ 2025-09-27 00:15:36.050256 | orchestrator | skipping: [testbed-manager] 2025-09-27 00:15:36.050421 | orchestrator | 2025-09-27 00:15:36.050439 | orchestrator | PLAY [Apply role traefik] ****************************************************** 2025-09-27 00:15:36.050452 | orchestrator | 2025-09-27 00:15:36.050467 | orchestrator | TASK [Gathering Facts] ********************************************************* 2025-09-27 00:15:38.761728 | orchestrator | ok: [testbed-manager] 2025-09-27 00:15:38.761835 | orchestrator | 2025-09-27 00:15:38.761852 | orchestrator | TASK [Apply traefik role] ****************************************************** 2025-09-27 00:15:38.869964 | orchestrator | included: osism.services.traefik for testbed-manager 2025-09-27 00:15:38.870077 | orchestrator | 2025-09-27 00:15:38.870095 | orchestrator | TASK [osism.services.traefik : Include config tasks] *************************** 2025-09-27 00:15:38.922393 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/traefik/tasks/config.yml for testbed-manager 2025-09-27 00:15:38.922447 | orchestrator | 2025-09-27 00:15:38.922459 | orchestrator | TASK [osism.services.traefik : Create required directories] ******************** 2025-09-27 00:15:40.016436 | orchestrator | changed: [testbed-manager] => (item=/opt/traefik) 2025-09-27 00:15:40.016536 | orchestrator | changed: [testbed-manager] => (item=/opt/traefik/certificates) 2025-09-27 00:15:40.016551 | orchestrator | changed: [testbed-manager] => (item=/opt/traefik/configuration) 2025-09-27 00:15:40.016564 | orchestrator | 2025-09-27 00:15:40.016577 | orchestrator | TASK [osism.services.traefik : Copy configuration files] *********************** 2025-09-27 00:15:41.817886 | orchestrator | changed: [testbed-manager] => (item=traefik.yml) 2025-09-27 00:15:41.817989 | orchestrator | changed: [testbed-manager] => (item=traefik.env) 2025-09-27 00:15:41.818005 | orchestrator | changed: [testbed-manager] => (item=certificates.yml) 2025-09-27 00:15:41.818080 | orchestrator | 2025-09-27 00:15:41.818094 | orchestrator | TASK [osism.services.traefik : Copy certificate cert files] ******************** 2025-09-27 00:15:42.460529 | orchestrator | changed: [testbed-manager] => (item=None) 2025-09-27 00:15:42.460615 | orchestrator | changed: [testbed-manager] 2025-09-27 00:15:42.460628 | orchestrator | 2025-09-27 00:15:42.460640 | orchestrator | TASK [osism.services.traefik : Copy certificate key files] ********************* 2025-09-27 00:15:43.095824 | orchestrator | changed: [testbed-manager] => (item=None) 2025-09-27 00:15:43.095914 | orchestrator | changed: [testbed-manager] 2025-09-27 00:15:43.095928 | orchestrator | 2025-09-27 00:15:43.095940 | orchestrator | TASK [osism.services.traefik : Copy dynamic 
configuration] ********************* 2025-09-27 00:15:43.153226 | orchestrator | skipping: [testbed-manager] 2025-09-27 00:15:43.153328 | orchestrator | 2025-09-27 00:15:43.153343 | orchestrator | TASK [osism.services.traefik : Remove dynamic configuration] ******************* 2025-09-27 00:15:43.533554 | orchestrator | ok: [testbed-manager] 2025-09-27 00:15:43.533635 | orchestrator | 2025-09-27 00:15:43.533647 | orchestrator | TASK [osism.services.traefik : Include service tasks] ************************** 2025-09-27 00:15:43.595077 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/traefik/tasks/service.yml for testbed-manager 2025-09-27 00:15:43.595130 | orchestrator | 2025-09-27 00:15:43.595145 | orchestrator | TASK [osism.services.traefik : Create traefik external network] **************** 2025-09-27 00:15:44.670711 | orchestrator | changed: [testbed-manager] 2025-09-27 00:15:44.670809 | orchestrator | 2025-09-27 00:15:44.670825 | orchestrator | TASK [osism.services.traefik : Copy docker-compose.yml file] ******************* 2025-09-27 00:15:45.499017 | orchestrator | changed: [testbed-manager] 2025-09-27 00:15:45.499100 | orchestrator | 2025-09-27 00:15:45.499113 | orchestrator | TASK [osism.services.traefik : Manage traefik service] ************************* 2025-09-27 00:15:56.928513 | orchestrator | changed: [testbed-manager] 2025-09-27 00:15:56.928623 | orchestrator | 2025-09-27 00:15:56.928640 | orchestrator | RUNNING HANDLER [osism.services.traefik : Restart traefik service] ************* 2025-09-27 00:15:56.979534 | orchestrator | skipping: [testbed-manager] 2025-09-27 00:15:56.979581 | orchestrator | 2025-09-27 00:15:56.979594 | orchestrator | PLAY [Deploy manager service] ************************************************** 2025-09-27 00:15:56.979606 | orchestrator | 2025-09-27 00:15:56.979618 | orchestrator | TASK [Gathering Facts] ********************************************************* 2025-09-27 00:15:58.889457 | orchestrator | ok: [testbed-manager] 2025-09-27 00:15:58.889565 | orchestrator | 2025-09-27 00:15:58.889603 | orchestrator | TASK [Apply manager role] ****************************************************** 2025-09-27 00:15:59.008650 | orchestrator | included: osism.services.manager for testbed-manager 2025-09-27 00:15:59.008715 | orchestrator | 2025-09-27 00:15:59.008728 | orchestrator | TASK [osism.services.manager : Include install tasks] ************************** 2025-09-27 00:15:59.067979 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/manager/tasks/install-Debian-family.yml for testbed-manager 2025-09-27 00:15:59.068021 | orchestrator | 2025-09-27 00:15:59.068035 | orchestrator | TASK [osism.services.manager : Install required packages] ********************** 2025-09-27 00:16:01.651627 | orchestrator | ok: [testbed-manager] 2025-09-27 00:16:01.651721 | orchestrator | 2025-09-27 00:16:01.651737 | orchestrator | TASK [osism.services.manager : Gather variables for each operating system] ***** 2025-09-27 00:16:01.707312 | orchestrator | ok: [testbed-manager] 2025-09-27 00:16:01.707341 | orchestrator | 2025-09-27 00:16:01.707354 | orchestrator | TASK [osism.services.manager : Include config tasks] *************************** 2025-09-27 00:16:01.839557 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/manager/tasks/config.yml for testbed-manager 2025-09-27 00:16:01.839608 | orchestrator | 2025-09-27 00:16:01.839620 | 
orchestrator | TASK [osism.services.manager : Create required directories] ******************** 2025-09-27 00:16:04.712056 | orchestrator | changed: [testbed-manager] => (item=/opt/ansible) 2025-09-27 00:16:04.712156 | orchestrator | changed: [testbed-manager] => (item=/opt/archive) 2025-09-27 00:16:04.712171 | orchestrator | changed: [testbed-manager] => (item=/opt/manager/configuration) 2025-09-27 00:16:04.712183 | orchestrator | changed: [testbed-manager] => (item=/opt/manager/data) 2025-09-27 00:16:04.712195 | orchestrator | ok: [testbed-manager] => (item=/opt/manager) 2025-09-27 00:16:04.712205 | orchestrator | changed: [testbed-manager] => (item=/opt/manager/secrets) 2025-09-27 00:16:04.712216 | orchestrator | changed: [testbed-manager] => (item=/opt/ansible/secrets) 2025-09-27 00:16:04.712227 | orchestrator | changed: [testbed-manager] => (item=/opt/state) 2025-09-27 00:16:04.712239 | orchestrator | 2025-09-27 00:16:04.712251 | orchestrator | TASK [osism.services.manager : Copy all environment file] ********************** 2025-09-27 00:16:05.337496 | orchestrator | changed: [testbed-manager] 2025-09-27 00:16:05.337596 | orchestrator | 2025-09-27 00:16:05.337613 | orchestrator | TASK [osism.services.manager : Copy client environment file] ******************* 2025-09-27 00:16:05.958716 | orchestrator | changed: [testbed-manager] 2025-09-27 00:16:05.958812 | orchestrator | 2025-09-27 00:16:05.958828 | orchestrator | TASK [osism.services.manager : Include ara config tasks] *********************** 2025-09-27 00:16:06.035788 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/manager/tasks/config-ara.yml for testbed-manager 2025-09-27 00:16:06.035852 | orchestrator | 2025-09-27 00:16:06.035866 | orchestrator | TASK [osism.services.manager : Copy ARA environment files] ********************* 2025-09-27 00:16:07.279342 | orchestrator | changed: [testbed-manager] => (item=ara) 2025-09-27 00:16:07.279445 | orchestrator | changed: [testbed-manager] => (item=ara-server) 2025-09-27 00:16:07.279460 | orchestrator | 2025-09-27 00:16:07.279473 | orchestrator | TASK [osism.services.manager : Copy MariaDB environment file] ****************** 2025-09-27 00:16:07.875921 | orchestrator | changed: [testbed-manager] 2025-09-27 00:16:07.876018 | orchestrator | 2025-09-27 00:16:07.876034 | orchestrator | TASK [osism.services.manager : Include vault config tasks] ********************* 2025-09-27 00:16:07.932844 | orchestrator | skipping: [testbed-manager] 2025-09-27 00:16:07.932910 | orchestrator | 2025-09-27 00:16:07.932924 | orchestrator | TASK [osism.services.manager : Include frontend config tasks] ****************** 2025-09-27 00:16:08.002425 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/manager/tasks/config-frontend.yml for testbed-manager 2025-09-27 00:16:08.002506 | orchestrator | 2025-09-27 00:16:08.002523 | orchestrator | TASK [osism.services.manager : Copy frontend environment file] ***************** 2025-09-27 00:16:08.630398 | orchestrator | changed: [testbed-manager] 2025-09-27 00:16:08.630484 | orchestrator | 2025-09-27 00:16:08.630498 | orchestrator | TASK [osism.services.manager : Include ansible config tasks] ******************* 2025-09-27 00:16:08.695098 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/manager/tasks/config-ansible.yml for testbed-manager 2025-09-27 00:16:08.695200 | orchestrator | 2025-09-27 00:16:08.695213 | orchestrator | 
TASK [osism.services.manager : Copy private ssh keys] ************************** 2025-09-27 00:16:10.049246 | orchestrator | changed: [testbed-manager] => (item=None) 2025-09-27 00:16:10.049353 | orchestrator | changed: [testbed-manager] => (item=None) 2025-09-27 00:16:10.049365 | orchestrator | changed: [testbed-manager] 2025-09-27 00:16:10.049376 | orchestrator | 2025-09-27 00:16:10.049387 | orchestrator | TASK [osism.services.manager : Copy ansible environment file] ****************** 2025-09-27 00:16:10.713132 | orchestrator | changed: [testbed-manager] 2025-09-27 00:16:10.713225 | orchestrator | 2025-09-27 00:16:10.713240 | orchestrator | TASK [osism.services.manager : Include netbox config tasks] ******************** 2025-09-27 00:16:10.779145 | orchestrator | skipping: [testbed-manager] 2025-09-27 00:16:10.779217 | orchestrator | 2025-09-27 00:16:10.779231 | orchestrator | TASK [osism.services.manager : Include celery config tasks] ******************** 2025-09-27 00:16:10.860223 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/manager/tasks/config-celery.yml for testbed-manager 2025-09-27 00:16:10.860331 | orchestrator | 2025-09-27 00:16:10.860345 | orchestrator | TASK [osism.services.manager : Set fs.inotify.max_user_watches] **************** 2025-09-27 00:16:11.386967 | orchestrator | changed: [testbed-manager] 2025-09-27 00:16:11.387063 | orchestrator | 2025-09-27 00:16:11.387077 | orchestrator | TASK [osism.services.manager : Set fs.inotify.max_user_instances] ************** 2025-09-27 00:16:11.770697 | orchestrator | changed: [testbed-manager] 2025-09-27 00:16:11.770778 | orchestrator | 2025-09-27 00:16:11.770791 | orchestrator | TASK [osism.services.manager : Copy celery environment files] ****************** 2025-09-27 00:16:12.991720 | orchestrator | changed: [testbed-manager] => (item=conductor) 2025-09-27 00:16:12.991801 | orchestrator | changed: [testbed-manager] => (item=openstack) 2025-09-27 00:16:12.991813 | orchestrator | 2025-09-27 00:16:12.991824 | orchestrator | TASK [osism.services.manager : Copy listener environment file] ***************** 2025-09-27 00:16:13.622308 | orchestrator | changed: [testbed-manager] 2025-09-27 00:16:13.622413 | orchestrator | 2025-09-27 00:16:13.622430 | orchestrator | TASK [osism.services.manager : Check for conductor.yml] ************************ 2025-09-27 00:16:14.019913 | orchestrator | ok: [testbed-manager] 2025-09-27 00:16:14.020006 | orchestrator | 2025-09-27 00:16:14.020021 | orchestrator | TASK [osism.services.manager : Copy conductor configuration file] ************** 2025-09-27 00:16:14.373771 | orchestrator | changed: [testbed-manager] 2025-09-27 00:16:14.373873 | orchestrator | 2025-09-27 00:16:14.373890 | orchestrator | TASK [osism.services.manager : Copy empty conductor configuration file] ******** 2025-09-27 00:16:14.420562 | orchestrator | skipping: [testbed-manager] 2025-09-27 00:16:14.420596 | orchestrator | 2025-09-27 00:16:14.420608 | orchestrator | TASK [osism.services.manager : Include wrapper config tasks] ******************* 2025-09-27 00:16:14.493869 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/manager/tasks/config-wrapper.yml for testbed-manager 2025-09-27 00:16:14.493926 | orchestrator | 2025-09-27 00:16:14.493942 | orchestrator | TASK [osism.services.manager : Include wrapper vars file] ********************** 2025-09-27 00:16:14.529071 | orchestrator | ok: [testbed-manager] 2025-09-27 00:16:14.529101 | 
orchestrator | 2025-09-27 00:16:14.529113 | orchestrator | TASK [osism.services.manager : Copy wrapper scripts] *************************** 2025-09-27 00:16:16.556815 | orchestrator | changed: [testbed-manager] => (item=osism) 2025-09-27 00:16:16.556909 | orchestrator | changed: [testbed-manager] => (item=osism-update-docker) 2025-09-27 00:16:16.556924 | orchestrator | changed: [testbed-manager] => (item=osism-update-manager) 2025-09-27 00:16:16.556936 | orchestrator | 2025-09-27 00:16:16.556949 | orchestrator | TASK [osism.services.manager : Copy cilium wrapper script] ********************* 2025-09-27 00:16:17.287668 | orchestrator | changed: [testbed-manager] 2025-09-27 00:16:17.287778 | orchestrator | 2025-09-27 00:16:17.287797 | orchestrator | TASK [osism.services.manager : Copy hubble wrapper script] ********************* 2025-09-27 00:16:18.064872 | orchestrator | changed: [testbed-manager] 2025-09-27 00:16:18.064995 | orchestrator | 2025-09-27 00:16:18.065014 | orchestrator | TASK [osism.services.manager : Copy flux wrapper script] *********************** 2025-09-27 00:16:18.784136 | orchestrator | changed: [testbed-manager] 2025-09-27 00:16:18.784196 | orchestrator | 2025-09-27 00:16:18.784209 | orchestrator | TASK [osism.services.manager : Include scripts config tasks] ******************* 2025-09-27 00:16:18.851011 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/manager/tasks/config-scripts.yml for testbed-manager 2025-09-27 00:16:18.851043 | orchestrator | 2025-09-27 00:16:18.851055 | orchestrator | TASK [osism.services.manager : Include scripts vars file] ********************** 2025-09-27 00:16:18.898492 | orchestrator | ok: [testbed-manager] 2025-09-27 00:16:18.898530 | orchestrator | 2025-09-27 00:16:18.898541 | orchestrator | TASK [osism.services.manager : Copy scripts] *********************************** 2025-09-27 00:16:19.660722 | orchestrator | changed: [testbed-manager] => (item=osism-include) 2025-09-27 00:16:19.660809 | orchestrator | 2025-09-27 00:16:19.660824 | orchestrator | TASK [osism.services.manager : Include service tasks] ************************** 2025-09-27 00:16:19.740956 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/manager/tasks/service.yml for testbed-manager 2025-09-27 00:16:19.741020 | orchestrator | 2025-09-27 00:16:19.741033 | orchestrator | TASK [osism.services.manager : Copy manager systemd unit file] ***************** 2025-09-27 00:16:20.456160 | orchestrator | changed: [testbed-manager] 2025-09-27 00:16:20.456254 | orchestrator | 2025-09-27 00:16:20.456306 | orchestrator | TASK [osism.services.manager : Create traefik external network] **************** 2025-09-27 00:16:21.036024 | orchestrator | ok: [testbed-manager] 2025-09-27 00:16:21.036068 | orchestrator | 2025-09-27 00:16:21.036079 | orchestrator | TASK [osism.services.manager : Set mariadb healthcheck for mariadb < 11.0.0] *** 2025-09-27 00:16:21.095398 | orchestrator | skipping: [testbed-manager] 2025-09-27 00:16:21.095484 | orchestrator | 2025-09-27 00:16:21.095501 | orchestrator | TASK [osism.services.manager : Set mariadb healthcheck for mariadb >= 11.0.0] *** 2025-09-27 00:16:21.152014 | orchestrator | ok: [testbed-manager] 2025-09-27 00:16:21.152060 | orchestrator | 2025-09-27 00:16:21.152073 | orchestrator | TASK [osism.services.manager : Copy docker-compose.yml file] ******************* 2025-09-27 00:16:22.011941 | orchestrator | changed: [testbed-manager] 2025-09-27 
00:16:22.012030 | orchestrator | 2025-09-27 00:16:22.012044 | orchestrator | TASK [osism.services.manager : Pull container images] ************************** 2025-09-27 00:17:29.394349 | orchestrator | changed: [testbed-manager] 2025-09-27 00:17:29.394473 | orchestrator | 2025-09-27 00:17:29.394492 | orchestrator | TASK [osism.services.manager : Stop and disable old service docker-compose@manager] *** 2025-09-27 00:17:30.410829 | orchestrator | ok: [testbed-manager] 2025-09-27 00:17:30.410928 | orchestrator | 2025-09-27 00:17:30.410942 | orchestrator | TASK [osism.services.manager : Do a manual start of the manager service] ******* 2025-09-27 00:17:30.548315 | orchestrator | skipping: [testbed-manager] 2025-09-27 00:17:30.548384 | orchestrator | 2025-09-27 00:17:30.548399 | orchestrator | TASK [osism.services.manager : Manage manager service] ************************* 2025-09-27 00:17:32.854127 | orchestrator | changed: [testbed-manager] 2025-09-27 00:17:32.854283 | orchestrator | 2025-09-27 00:17:32.854302 | orchestrator | TASK [osism.services.manager : Register that manager service was started] ****** 2025-09-27 00:17:32.915450 | orchestrator | ok: [testbed-manager] 2025-09-27 00:17:32.915501 | orchestrator | 2025-09-27 00:17:32.915516 | orchestrator | TASK [osism.services.manager : Flush handlers] ********************************* 2025-09-27 00:17:32.915529 | orchestrator | 2025-09-27 00:17:32.915541 | orchestrator | RUNNING HANDLER [osism.services.manager : Restart manager service] ************* 2025-09-27 00:17:32.958453 | orchestrator | skipping: [testbed-manager] 2025-09-27 00:17:32.958513 | orchestrator | 2025-09-27 00:17:32.958526 | orchestrator | RUNNING HANDLER [osism.services.manager : Wait for manager service to start] *** 2025-09-27 00:18:33.015620 | orchestrator | Pausing for 60 seconds 2025-09-27 00:18:33.015736 | orchestrator | changed: [testbed-manager] 2025-09-27 00:18:33.015750 | orchestrator | 2025-09-27 00:18:33.015762 | orchestrator | RUNNING HANDLER [osism.services.manager : Ensure that all containers are up] *** 2025-09-27 00:18:38.126655 | orchestrator | changed: [testbed-manager] 2025-09-27 00:18:38.126780 | orchestrator | 2025-09-27 00:18:38.126798 | orchestrator | RUNNING HANDLER [osism.services.manager : Wait for an healthy manager service] *** 2025-09-27 00:19:19.672478 | orchestrator | FAILED - RETRYING: [testbed-manager]: Wait for an healthy manager service (50 retries left). 2025-09-27 00:19:19.672594 | orchestrator | FAILED - RETRYING: [testbed-manager]: Wait for an healthy manager service (49 retries left). 
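The handler sequence above pauses for 60 seconds, ensures all containers are up, and then repeatedly probes the manager service until it reports healthy; the two FAILED - RETRYING lines are that retry loop at work (up to 50 attempts), not an error. A minimal bash sketch of the same kind of poll, assuming the service exposes a Docker healthcheck; the container name osism-ansible is taken from the trace later in this log, and the retry count and sleep interval are illustrative:

#!/usr/bin/env bash
# Poll a container's Docker healthcheck until it reports "healthy" or we give up.
set -e
name=osism-ansible   # container to probe (assumption: it defines a healthcheck)
for attempt in $(seq 1 50); do
    status=$(docker inspect -f '{{.State.Health.Status}}' "$name")
    if [ "$status" = "healthy" ]; then
        echo "$name is healthy after $attempt attempt(s)"
        exit 0
    fi
    echo "attempt $attempt: $name is $status, retrying ..."
    sleep 5
done
echo "$name did not become healthy in time" >&2
exit 1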
2025-09-27 00:19:19.672610 | orchestrator | changed: [testbed-manager] 2025-09-27 00:19:19.672651 | orchestrator | 2025-09-27 00:19:19.672663 | orchestrator | RUNNING HANDLER [osism.services.manager : Copy osismclient bash completion script] *** 2025-09-27 00:19:29.322529 | orchestrator | changed: [testbed-manager] 2025-09-27 00:19:29.322636 | orchestrator | 2025-09-27 00:19:29.322651 | orchestrator | TASK [osism.services.manager : Include initialize tasks] *********************** 2025-09-27 00:19:29.399391 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/manager/tasks/initialize.yml for testbed-manager 2025-09-27 00:19:29.399444 | orchestrator | 2025-09-27 00:19:29.399459 | orchestrator | TASK [osism.services.manager : Flush handlers] ********************************* 2025-09-27 00:19:29.399471 | orchestrator | 2025-09-27 00:19:29.399481 | orchestrator | TASK [osism.services.manager : Include vault initialize tasks] ***************** 2025-09-27 00:19:29.443595 | orchestrator | skipping: [testbed-manager] 2025-09-27 00:19:29.443637 | orchestrator | 2025-09-27 00:19:29.443654 | orchestrator | TASK [osism.services.manager : Include version verification tasks] ************* 2025-09-27 00:19:29.505719 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/manager/tasks/verify-versions.yml for testbed-manager 2025-09-27 00:19:29.505745 | orchestrator | 2025-09-27 00:19:29.505757 | orchestrator | TASK [osism.services.manager : Deploy service manager version check script] **** 2025-09-27 00:19:30.260093 | orchestrator | changed: [testbed-manager] 2025-09-27 00:19:30.260213 | orchestrator | 2025-09-27 00:19:30.260270 | orchestrator | TASK [osism.services.manager : Execute service manager version check] ********** 2025-09-27 00:19:34.099994 | orchestrator | ok: [testbed-manager] 2025-09-27 00:19:34.100100 | orchestrator | 2025-09-27 00:19:34.100116 | orchestrator | TASK [osism.services.manager : Display version check results] ****************** 2025-09-27 00:19:34.181269 | orchestrator | ok: [testbed-manager] => { 2025-09-27 00:19:34.181363 | orchestrator | "version_check_result.stdout_lines": [ 2025-09-27 00:19:34.181380 | orchestrator | "=== OSISM Container Version Check ===", 2025-09-27 00:19:34.181392 | orchestrator | "Checking running containers against expected versions...", 2025-09-27 00:19:34.181404 | orchestrator | "", 2025-09-27 00:19:34.181417 | orchestrator | "Checking service: inventory_reconciler (Inventory Reconciler Service)", 2025-09-27 00:19:34.181429 | orchestrator | " Expected: registry.osism.tech/osism/inventory-reconciler:latest", 2025-09-27 00:19:34.181440 | orchestrator | " Enabled: true", 2025-09-27 00:19:34.181464 | orchestrator | " Running: registry.osism.tech/osism/inventory-reconciler:latest", 2025-09-27 00:19:34.181476 | orchestrator | " Status: ✅ MATCH", 2025-09-27 00:19:34.181488 | orchestrator | "", 2025-09-27 00:19:34.181499 | orchestrator | "Checking service: osism-ansible (OSISM Ansible Service)", 2025-09-27 00:19:34.181511 | orchestrator | " Expected: registry.osism.tech/osism/osism-ansible:latest", 2025-09-27 00:19:34.181522 | orchestrator | " Enabled: true", 2025-09-27 00:19:34.181533 | orchestrator | " Running: registry.osism.tech/osism/osism-ansible:latest", 2025-09-27 00:19:34.181544 | orchestrator | " Status: ✅ MATCH", 2025-09-27 00:19:34.181555 | orchestrator | "", 2025-09-27 00:19:34.181566 | orchestrator | "Checking service: osism-kubernetes (Osism-Kubernetes 
Service)", 2025-09-27 00:19:34.181577 | orchestrator | " Expected: registry.osism.tech/osism/osism-kubernetes:latest", 2025-09-27 00:19:34.181588 | orchestrator | " Enabled: true", 2025-09-27 00:19:34.181599 | orchestrator | " Running: registry.osism.tech/osism/osism-kubernetes:latest", 2025-09-27 00:19:34.181610 | orchestrator | " Status: ✅ MATCH", 2025-09-27 00:19:34.181621 | orchestrator | "", 2025-09-27 00:19:34.181632 | orchestrator | "Checking service: ceph-ansible (Ceph-Ansible Service)", 2025-09-27 00:19:34.181643 | orchestrator | " Expected: registry.osism.tech/osism/ceph-ansible:reef", 2025-09-27 00:19:34.181654 | orchestrator | " Enabled: true", 2025-09-27 00:19:34.181665 | orchestrator | " Running: registry.osism.tech/osism/ceph-ansible:reef", 2025-09-27 00:19:34.181677 | orchestrator | " Status: ✅ MATCH", 2025-09-27 00:19:34.181688 | orchestrator | "", 2025-09-27 00:19:34.181699 | orchestrator | "Checking service: kolla-ansible (Kolla-Ansible Service)", 2025-09-27 00:19:34.181710 | orchestrator | " Expected: registry.osism.tech/osism/kolla-ansible:2024.2", 2025-09-27 00:19:34.181742 | orchestrator | " Enabled: true", 2025-09-27 00:19:34.181754 | orchestrator | " Running: registry.osism.tech/osism/kolla-ansible:2024.2", 2025-09-27 00:19:34.181765 | orchestrator | " Status: ✅ MATCH", 2025-09-27 00:19:34.181775 | orchestrator | "", 2025-09-27 00:19:34.181786 | orchestrator | "Checking service: osismclient (OSISM Client)", 2025-09-27 00:19:34.181797 | orchestrator | " Expected: registry.osism.tech/osism/osism:latest", 2025-09-27 00:19:34.181808 | orchestrator | " Enabled: true", 2025-09-27 00:19:34.181819 | orchestrator | " Running: registry.osism.tech/osism/osism:latest", 2025-09-27 00:19:34.181830 | orchestrator | " Status: ✅ MATCH", 2025-09-27 00:19:34.181841 | orchestrator | "", 2025-09-27 00:19:34.181852 | orchestrator | "Checking service: ara-server (ARA Server)", 2025-09-27 00:19:34.181862 | orchestrator | " Expected: registry.osism.tech/osism/ara-server:1.7.3", 2025-09-27 00:19:34.181873 | orchestrator | " Enabled: true", 2025-09-27 00:19:34.181884 | orchestrator | " Running: registry.osism.tech/osism/ara-server:1.7.3", 2025-09-27 00:19:34.181895 | orchestrator | " Status: ✅ MATCH", 2025-09-27 00:19:34.181913 | orchestrator | "", 2025-09-27 00:19:34.181932 | orchestrator | "Checking service: mariadb (MariaDB for ARA)", 2025-09-27 00:19:34.181959 | orchestrator | " Expected: registry.osism.tech/dockerhub/library/mariadb:11.8.3", 2025-09-27 00:19:34.181978 | orchestrator | " Enabled: true", 2025-09-27 00:19:34.181994 | orchestrator | " Running: registry.osism.tech/dockerhub/library/mariadb:11.8.3", 2025-09-27 00:19:34.182005 | orchestrator | " Status: ✅ MATCH", 2025-09-27 00:19:34.182072 | orchestrator | "", 2025-09-27 00:19:34.182086 | orchestrator | "Checking service: frontend (OSISM Frontend)", 2025-09-27 00:19:34.182097 | orchestrator | " Expected: registry.osism.tech/osism/osism-frontend:latest", 2025-09-27 00:19:34.182108 | orchestrator | " Enabled: true", 2025-09-27 00:19:34.182123 | orchestrator | " Running: registry.osism.tech/osism/osism-frontend:latest", 2025-09-27 00:19:34.182135 | orchestrator | " Status: ✅ MATCH", 2025-09-27 00:19:34.182147 | orchestrator | "", 2025-09-27 00:19:34.182158 | orchestrator | "Checking service: redis (Redis Cache)", 2025-09-27 00:19:34.182169 | orchestrator | " Expected: registry.osism.tech/dockerhub/library/redis:7.4.5-alpine", 2025-09-27 00:19:34.182180 | orchestrator | " Enabled: true", 2025-09-27 00:19:34.182191 | orchestrator | 
" Running: registry.osism.tech/dockerhub/library/redis:7.4.5-alpine", 2025-09-27 00:19:34.182202 | orchestrator | " Status: ✅ MATCH", 2025-09-27 00:19:34.182213 | orchestrator | "", 2025-09-27 00:19:34.182255 | orchestrator | "Checking service: api (OSISM API Service)", 2025-09-27 00:19:34.182266 | orchestrator | " Expected: registry.osism.tech/osism/osism:latest", 2025-09-27 00:19:34.182277 | orchestrator | " Enabled: true", 2025-09-27 00:19:34.182288 | orchestrator | " Running: registry.osism.tech/osism/osism:latest", 2025-09-27 00:19:34.182299 | orchestrator | " Status: ✅ MATCH", 2025-09-27 00:19:34.182310 | orchestrator | "", 2025-09-27 00:19:34.182320 | orchestrator | "Checking service: listener (OpenStack Event Listener)", 2025-09-27 00:19:34.182331 | orchestrator | " Expected: registry.osism.tech/osism/osism:latest", 2025-09-27 00:19:34.182342 | orchestrator | " Enabled: true", 2025-09-27 00:19:34.182353 | orchestrator | " Running: registry.osism.tech/osism/osism:latest", 2025-09-27 00:19:34.182364 | orchestrator | " Status: ✅ MATCH", 2025-09-27 00:19:34.182375 | orchestrator | "", 2025-09-27 00:19:34.182386 | orchestrator | "Checking service: openstack (OpenStack Integration)", 2025-09-27 00:19:34.182397 | orchestrator | " Expected: registry.osism.tech/osism/osism:latest", 2025-09-27 00:19:34.182408 | orchestrator | " Enabled: true", 2025-09-27 00:19:34.182419 | orchestrator | " Running: registry.osism.tech/osism/osism:latest", 2025-09-27 00:19:34.182429 | orchestrator | " Status: ✅ MATCH", 2025-09-27 00:19:34.182440 | orchestrator | "", 2025-09-27 00:19:34.182451 | orchestrator | "Checking service: beat (Celery Beat Scheduler)", 2025-09-27 00:19:34.182462 | orchestrator | " Expected: registry.osism.tech/osism/osism:latest", 2025-09-27 00:19:34.182473 | orchestrator | " Enabled: true", 2025-09-27 00:19:34.182484 | orchestrator | " Running: registry.osism.tech/osism/osism:latest", 2025-09-27 00:19:34.182495 | orchestrator | " Status: ✅ MATCH", 2025-09-27 00:19:34.182515 | orchestrator | "", 2025-09-27 00:19:34.182526 | orchestrator | "Checking service: flower (Celery Flower Monitor)", 2025-09-27 00:19:34.182555 | orchestrator | " Expected: registry.osism.tech/osism/osism:latest", 2025-09-27 00:19:34.182566 | orchestrator | " Enabled: true", 2025-09-27 00:19:34.182577 | orchestrator | " Running: registry.osism.tech/osism/osism:latest", 2025-09-27 00:19:34.182588 | orchestrator | " Status: ✅ MATCH", 2025-09-27 00:19:34.182599 | orchestrator | "", 2025-09-27 00:19:34.182610 | orchestrator | "=== Summary ===", 2025-09-27 00:19:34.182621 | orchestrator | "Errors (version mismatches): 0", 2025-09-27 00:19:34.182631 | orchestrator | "Warnings (expected containers not running): 0", 2025-09-27 00:19:34.182642 | orchestrator | "", 2025-09-27 00:19:34.182653 | orchestrator | "✅ All running containers match expected versions!" 
2025-09-27 00:19:34.182664 | orchestrator | ] 2025-09-27 00:19:34.182675 | orchestrator | } 2025-09-27 00:19:34.182686 | orchestrator | 2025-09-27 00:19:34.182698 | orchestrator | TASK [osism.services.manager : Skip version check due to service configuration] *** 2025-09-27 00:19:34.237549 | orchestrator | skipping: [testbed-manager] 2025-09-27 00:19:34.237613 | orchestrator | 2025-09-27 00:19:34.237626 | orchestrator | PLAY RECAP ********************************************************************* 2025-09-27 00:19:34.237640 | orchestrator | testbed-manager : ok=70 changed=37 unreachable=0 failed=0 skipped=13 rescued=0 ignored=0 2025-09-27 00:19:34.237652 | orchestrator | 2025-09-27 00:19:34.331401 | orchestrator | + [[ -e /opt/venv/bin/activate ]] 2025-09-27 00:19:34.331463 | orchestrator | + deactivate 2025-09-27 00:19:34.331476 | orchestrator | + '[' -n /usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin ']' 2025-09-27 00:19:34.331487 | orchestrator | + PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin 2025-09-27 00:19:34.331497 | orchestrator | + export PATH 2025-09-27 00:19:34.331506 | orchestrator | + unset _OLD_VIRTUAL_PATH 2025-09-27 00:19:34.331516 | orchestrator | + '[' -n '' ']' 2025-09-27 00:19:34.331526 | orchestrator | + hash -r 2025-09-27 00:19:34.331535 | orchestrator | + '[' -n '' ']' 2025-09-27 00:19:34.331545 | orchestrator | + unset VIRTUAL_ENV 2025-09-27 00:19:34.331554 | orchestrator | + unset VIRTUAL_ENV_PROMPT 2025-09-27 00:19:34.331564 | orchestrator | + '[' '!' '' = nondestructive ']' 2025-09-27 00:19:34.331573 | orchestrator | + unset -f deactivate 2025-09-27 00:19:34.331583 | orchestrator | + cp /home/dragon/.ssh/id_rsa.pub /opt/ansible/secrets/id_rsa.operator.pub 2025-09-27 00:19:34.339162 | orchestrator | + [[ ceph-ansible == \c\e\p\h\-\a\n\s\i\b\l\e ]] 2025-09-27 00:19:34.339182 | orchestrator | + wait_for_container_healthy 60 ceph-ansible 2025-09-27 00:19:34.339192 | orchestrator | + local max_attempts=60 2025-09-27 00:19:34.339201 | orchestrator | + local name=ceph-ansible 2025-09-27 00:19:34.339210 | orchestrator | + local attempt_num=1 2025-09-27 00:19:34.339976 | orchestrator | ++ /usr/bin/docker inspect -f '{{.State.Health.Status}}' ceph-ansible 2025-09-27 00:19:34.377968 | orchestrator | + [[ healthy == \h\e\a\l\t\h\y ]] 2025-09-27 00:19:34.377996 | orchestrator | + wait_for_container_healthy 60 kolla-ansible 2025-09-27 00:19:34.378006 | orchestrator | + local max_attempts=60 2025-09-27 00:19:34.378064 | orchestrator | + local name=kolla-ansible 2025-09-27 00:19:34.378075 | orchestrator | + local attempt_num=1 2025-09-27 00:19:34.379029 | orchestrator | ++ /usr/bin/docker inspect -f '{{.State.Health.Status}}' kolla-ansible 2025-09-27 00:19:34.421335 | orchestrator | + [[ healthy == \h\e\a\l\t\h\y ]] 2025-09-27 00:19:34.421357 | orchestrator | + wait_for_container_healthy 60 osism-ansible 2025-09-27 00:19:34.421367 | orchestrator | + local max_attempts=60 2025-09-27 00:19:34.421376 | orchestrator | + local name=osism-ansible 2025-09-27 00:19:34.421386 | orchestrator | + local attempt_num=1 2025-09-27 00:19:34.422472 | orchestrator | ++ /usr/bin/docker inspect -f '{{.State.Health.Status}}' osism-ansible 2025-09-27 00:19:34.468091 | orchestrator | + [[ healthy == \h\e\a\l\t\h\y ]] 2025-09-27 00:19:34.468140 | orchestrator | + [[ true == \t\r\u\e ]] 2025-09-27 00:19:34.468150 | orchestrator | + sh -c /opt/configuration/scripts/disable-ara.sh 2025-09-27 
00:19:35.092948 | orchestrator | + docker compose --project-directory /opt/manager ps
2025-09-27 00:19:35.268200 | orchestrator | NAME IMAGE COMMAND SERVICE CREATED STATUS PORTS
2025-09-27 00:19:35.268335 | orchestrator | ceph-ansible registry.osism.tech/osism/ceph-ansible:reef "/entrypoint.sh osis…" ceph-ansible 2 minutes ago Up About a minute (healthy)
2025-09-27 00:19:35.268386 | orchestrator | kolla-ansible registry.osism.tech/osism/kolla-ansible:2024.2 "/entrypoint.sh osis…" kolla-ansible 2 minutes ago Up About a minute (healthy)
2025-09-27 00:19:35.269067 | orchestrator | manager-api-1 registry.osism.tech/osism/osism:latest "/sbin/tini -- osism…" api 2 minutes ago Up About a minute (healthy) 192.168.16.5:8000->8000/tcp
2025-09-27 00:19:35.269085 | orchestrator | manager-ara-server-1 registry.osism.tech/osism/ara-server:1.7.3 "sh -c '/wait && /ru…" ara-server 2 minutes ago Up About a minute (healthy) 8000/tcp
2025-09-27 00:19:35.269098 | orchestrator | manager-beat-1 registry.osism.tech/osism/osism:latest "/sbin/tini -- osism…" beat 2 minutes ago Up About a minute (healthy)
2025-09-27 00:19:35.269110 | orchestrator | manager-flower-1 registry.osism.tech/osism/osism:latest "/sbin/tini -- osism…" flower 2 minutes ago Up About a minute (healthy)
2025-09-27 00:19:35.269136 | orchestrator | manager-inventory_reconciler-1 registry.osism.tech/osism/inventory-reconciler:latest "/sbin/tini -- /entr…" inventory_reconciler 2 minutes ago Up 57 seconds (healthy)
2025-09-27 00:19:35.269148 | orchestrator | manager-listener-1 registry.osism.tech/osism/osism:latest "/sbin/tini -- osism…" listener 2 minutes ago Up About a minute (healthy)
2025-09-27 00:19:35.269160 | orchestrator | manager-mariadb-1 registry.osism.tech/dockerhub/library/mariadb:11.8.3 "docker-entrypoint.s…" mariadb 2 minutes ago Up About a minute (healthy) 3306/tcp
2025-09-27 00:19:35.269171 | orchestrator | manager-openstack-1 registry.osism.tech/osism/osism:latest "/sbin/tini -- osism…" openstack 2 minutes ago Up About a minute (healthy)
2025-09-27 00:19:35.269183 | orchestrator | manager-redis-1 registry.osism.tech/dockerhub/library/redis:7.4.5-alpine "docker-entrypoint.s…" redis 2 minutes ago Up About a minute (healthy) 6379/tcp
2025-09-27 00:19:35.269193 | orchestrator | osism-ansible registry.osism.tech/osism/osism-ansible:latest "/entrypoint.sh osis…" osism-ansible 2 minutes ago Up About a minute (healthy)
2025-09-27 00:19:35.269202 | orchestrator | osism-frontend registry.osism.tech/osism/osism-frontend:latest "docker-entrypoint.s…" frontend 2 minutes ago Up About a minute 192.168.16.5:3000->3000/tcp
2025-09-27 00:19:35.269212 | orchestrator | osism-kubernetes registry.osism.tech/osism/osism-kubernetes:latest "/entrypoint.sh osis…" osism-kubernetes 2 minutes ago Up About a minute (healthy)
2025-09-27 00:19:35.269238 | orchestrator | osismclient registry.osism.tech/osism/osism:latest "/sbin/tini -- sleep…" osismclient 2 minutes ago Up About a minute (healthy)
2025-09-27 00:19:35.275433 | orchestrator | ++ semver latest 7.0.0
2025-09-27 00:19:35.322153 | orchestrator | + [[ -1 -ge 0 ]]
2025-09-27 00:19:35.322207 | orchestrator | + [[ latest == \l\a\t\e\s\t ]]
2025-09-27 00:19:35.322239 | orchestrator | + sed -i s/community.general.yaml/osism.commons.still_alive/ /opt/configuration/environments/ansible.cfg
2025-09-27 00:19:35.326366 | orchestrator | + osism apply resolvconf -l testbed-manager
2025-09-27 00:19:47.521063 | orchestrator | 2025-09-27 00:19:47 | INFO  | Task 41f06a48-2cdf-47ee-b38b-01a8f2c9d036 (resolvconf) was
prepared for execution. 2025-09-27 00:19:47.521178 | orchestrator | 2025-09-27 00:19:47 | INFO  | It takes a moment until task 41f06a48-2cdf-47ee-b38b-01a8f2c9d036 (resolvconf) has been started and output is visible here. 2025-09-27 00:20:00.551973 | orchestrator | 2025-09-27 00:20:00.552103 | orchestrator | PLAY [Apply role resolvconf] *************************************************** 2025-09-27 00:20:00.552119 | orchestrator | 2025-09-27 00:20:00.552131 | orchestrator | TASK [Gathering Facts] ********************************************************* 2025-09-27 00:20:00.552143 | orchestrator | Saturday 27 September 2025 00:19:51 +0000 (0:00:00.145) 0:00:00.145 **** 2025-09-27 00:20:00.552154 | orchestrator | ok: [testbed-manager] 2025-09-27 00:20:00.552166 | orchestrator | 2025-09-27 00:20:00.552177 | orchestrator | TASK [osism.commons.resolvconf : Check minimum and maximum number of name servers] *** 2025-09-27 00:20:00.552188 | orchestrator | Saturday 27 September 2025 00:19:54 +0000 (0:00:03.553) 0:00:03.698 **** 2025-09-27 00:20:00.552200 | orchestrator | skipping: [testbed-manager] 2025-09-27 00:20:00.552211 | orchestrator | 2025-09-27 00:20:00.552271 | orchestrator | TASK [osism.commons.resolvconf : Include resolvconf tasks] ********************* 2025-09-27 00:20:00.552283 | orchestrator | Saturday 27 September 2025 00:19:54 +0000 (0:00:00.064) 0:00:03.762 **** 2025-09-27 00:20:00.552294 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/resolvconf/tasks/configure-resolv.yml for testbed-manager 2025-09-27 00:20:00.552306 | orchestrator | 2025-09-27 00:20:00.552317 | orchestrator | TASK [osism.commons.resolvconf : Include distribution specific installation tasks] *** 2025-09-27 00:20:00.552328 | orchestrator | Saturday 27 September 2025 00:19:55 +0000 (0:00:00.079) 0:00:03.842 **** 2025-09-27 00:20:00.552350 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/resolvconf/tasks/install-Debian-family.yml for testbed-manager 2025-09-27 00:20:00.552362 | orchestrator | 2025-09-27 00:20:00.552373 | orchestrator | TASK [osism.commons.resolvconf : Remove packages configuring /etc/resolv.conf] *** 2025-09-27 00:20:00.552384 | orchestrator | Saturday 27 September 2025 00:19:55 +0000 (0:00:00.072) 0:00:03.915 **** 2025-09-27 00:20:00.552395 | orchestrator | ok: [testbed-manager] 2025-09-27 00:20:00.552406 | orchestrator | 2025-09-27 00:20:00.552417 | orchestrator | TASK [osism.commons.resolvconf : Install package systemd-resolved] ************* 2025-09-27 00:20:00.552428 | orchestrator | Saturday 27 September 2025 00:19:56 +0000 (0:00:00.981) 0:00:04.896 **** 2025-09-27 00:20:00.552439 | orchestrator | skipping: [testbed-manager] 2025-09-27 00:20:00.552450 | orchestrator | 2025-09-27 00:20:00.552462 | orchestrator | TASK [osism.commons.resolvconf : Retrieve file status of /etc/resolv.conf] ***** 2025-09-27 00:20:00.552473 | orchestrator | Saturday 27 September 2025 00:19:56 +0000 (0:00:00.050) 0:00:04.946 **** 2025-09-27 00:20:00.552483 | orchestrator | ok: [testbed-manager] 2025-09-27 00:20:00.552494 | orchestrator | 2025-09-27 00:20:00.552505 | orchestrator | TASK [osism.commons.resolvconf : Archive existing file /etc/resolv.conf] ******* 2025-09-27 00:20:00.552516 | orchestrator | Saturday 27 September 2025 00:19:56 +0000 (0:00:00.458) 0:00:05.405 **** 2025-09-27 00:20:00.552527 | orchestrator | skipping: [testbed-manager] 2025-09-27 00:20:00.552538 | orchestrator | 2025-09-27 00:20:00.552548 | 
orchestrator | TASK [osism.commons.resolvconf : Link /run/systemd/resolve/stub-resolv.conf to /etc/resolv.conf] *** 2025-09-27 00:20:00.552561 | orchestrator | Saturday 27 September 2025 00:19:56 +0000 (0:00:00.077) 0:00:05.482 **** 2025-09-27 00:20:00.552571 | orchestrator | changed: [testbed-manager] 2025-09-27 00:20:00.552582 | orchestrator | 2025-09-27 00:20:00.552593 | orchestrator | TASK [osism.commons.resolvconf : Copy configuration files] ********************* 2025-09-27 00:20:00.552603 | orchestrator | Saturday 27 September 2025 00:19:57 +0000 (0:00:00.515) 0:00:05.998 **** 2025-09-27 00:20:00.552614 | orchestrator | changed: [testbed-manager] 2025-09-27 00:20:00.552625 | orchestrator | 2025-09-27 00:20:00.552636 | orchestrator | TASK [osism.commons.resolvconf : Start/enable systemd-resolved service] ******** 2025-09-27 00:20:00.552647 | orchestrator | Saturday 27 September 2025 00:19:58 +0000 (0:00:01.027) 0:00:07.025 **** 2025-09-27 00:20:00.552658 | orchestrator | ok: [testbed-manager] 2025-09-27 00:20:00.552669 | orchestrator | 2025-09-27 00:20:00.552680 | orchestrator | TASK [osism.commons.resolvconf : Include distribution specific configuration tasks] *** 2025-09-27 00:20:00.552691 | orchestrator | Saturday 27 September 2025 00:19:59 +0000 (0:00:00.928) 0:00:07.953 **** 2025-09-27 00:20:00.552710 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/resolvconf/tasks/configure-Debian-family.yml for testbed-manager 2025-09-27 00:20:00.552721 | orchestrator | 2025-09-27 00:20:00.552732 | orchestrator | TASK [osism.commons.resolvconf : Restart systemd-resolved service] ************* 2025-09-27 00:20:00.552743 | orchestrator | Saturday 27 September 2025 00:19:59 +0000 (0:00:00.083) 0:00:08.037 **** 2025-09-27 00:20:00.552754 | orchestrator | changed: [testbed-manager] 2025-09-27 00:20:00.552765 | orchestrator | 2025-09-27 00:20:00.552775 | orchestrator | PLAY RECAP ********************************************************************* 2025-09-27 00:20:00.552788 | orchestrator | testbed-manager : ok=10  changed=3  unreachable=0 failed=0 skipped=3  rescued=0 ignored=0 2025-09-27 00:20:00.552799 | orchestrator | 2025-09-27 00:20:00.552811 | orchestrator | 2025-09-27 00:20:00.552822 | orchestrator | TASKS RECAP ******************************************************************** 2025-09-27 00:20:00.552833 | orchestrator | Saturday 27 September 2025 00:20:00 +0000 (0:00:01.124) 0:00:09.161 **** 2025-09-27 00:20:00.552844 | orchestrator | =============================================================================== 2025-09-27 00:20:00.552855 | orchestrator | Gathering Facts --------------------------------------------------------- 3.55s 2025-09-27 00:20:00.552865 | orchestrator | osism.commons.resolvconf : Restart systemd-resolved service ------------- 1.12s 2025-09-27 00:20:00.552876 | orchestrator | osism.commons.resolvconf : Copy configuration files --------------------- 1.03s 2025-09-27 00:20:00.552887 | orchestrator | osism.commons.resolvconf : Remove packages configuring /etc/resolv.conf --- 0.98s 2025-09-27 00:20:00.552910 | orchestrator | osism.commons.resolvconf : Start/enable systemd-resolved service -------- 0.93s 2025-09-27 00:20:00.552921 | orchestrator | osism.commons.resolvconf : Link /run/systemd/resolve/stub-resolv.conf to /etc/resolv.conf --- 0.52s 2025-09-27 00:20:00.552952 | orchestrator | osism.commons.resolvconf : Retrieve file status of /etc/resolv.conf ----- 0.46s 2025-09-27 00:20:00.552963 | orchestrator | 
osism.commons.resolvconf : Include distribution specific configuration tasks --- 0.08s 2025-09-27 00:20:00.552974 | orchestrator | osism.commons.resolvconf : Include resolvconf tasks --------------------- 0.08s 2025-09-27 00:20:00.552985 | orchestrator | osism.commons.resolvconf : Archive existing file /etc/resolv.conf ------- 0.08s 2025-09-27 00:20:00.552996 | orchestrator | osism.commons.resolvconf : Include distribution specific installation tasks --- 0.07s 2025-09-27 00:20:00.553006 | orchestrator | osism.commons.resolvconf : Check minimum and maximum number of name servers --- 0.06s 2025-09-27 00:20:00.553017 | orchestrator | osism.commons.resolvconf : Install package systemd-resolved ------------- 0.05s 2025-09-27 00:20:00.816430 | orchestrator | + osism apply sshconfig 2025-09-27 00:20:12.843323 | orchestrator | 2025-09-27 00:20:12 | INFO  | Task e6f43930-c9a9-4530-9146-3885149ef020 (sshconfig) was prepared for execution. 2025-09-27 00:20:12.843460 | orchestrator | 2025-09-27 00:20:12 | INFO  | It takes a moment until task e6f43930-c9a9-4530-9146-3885149ef020 (sshconfig) has been started and output is visible here. 2025-09-27 00:20:23.551678 | orchestrator | 2025-09-27 00:20:23.551798 | orchestrator | PLAY [Apply role sshconfig] **************************************************** 2025-09-27 00:20:23.551816 | orchestrator | 2025-09-27 00:20:23.551837 | orchestrator | TASK [osism.commons.sshconfig : Get home directory of operator user] *********** 2025-09-27 00:20:23.551858 | orchestrator | Saturday 27 September 2025 00:20:16 +0000 (0:00:00.120) 0:00:00.120 **** 2025-09-27 00:20:23.551879 | orchestrator | ok: [testbed-manager] 2025-09-27 00:20:23.551900 | orchestrator | 2025-09-27 00:20:23.551921 | orchestrator | TASK [osism.commons.sshconfig : Ensure .ssh/config.d exist] ******************** 2025-09-27 00:20:23.551941 | orchestrator | Saturday 27 September 2025 00:20:17 +0000 (0:00:00.469) 0:00:00.589 **** 2025-09-27 00:20:23.551959 | orchestrator | changed: [testbed-manager] 2025-09-27 00:20:23.551971 | orchestrator | 2025-09-27 00:20:23.551982 | orchestrator | TASK [osism.commons.sshconfig : Ensure config for each host exist] ************* 2025-09-27 00:20:23.551993 | orchestrator | Saturday 27 September 2025 00:20:17 +0000 (0:00:00.439) 0:00:01.028 **** 2025-09-27 00:20:23.552034 | orchestrator | changed: [testbed-manager] => (item=testbed-manager) 2025-09-27 00:20:23.552045 | orchestrator | changed: [testbed-manager] => (item=testbed-node-0) 2025-09-27 00:20:23.552056 | orchestrator | changed: [testbed-manager] => (item=testbed-node-1) 2025-09-27 00:20:23.552067 | orchestrator | changed: [testbed-manager] => (item=testbed-node-2) 2025-09-27 00:20:23.552078 | orchestrator | changed: [testbed-manager] => (item=testbed-node-3) 2025-09-27 00:20:23.552089 | orchestrator | changed: [testbed-manager] => (item=testbed-node-4) 2025-09-27 00:20:23.552100 | orchestrator | changed: [testbed-manager] => (item=testbed-node-5) 2025-09-27 00:20:23.552110 | orchestrator | 2025-09-27 00:20:23.552121 | orchestrator | TASK [osism.commons.sshconfig : Add extra config] ****************************** 2025-09-27 00:20:23.552132 | orchestrator | Saturday 27 September 2025 00:20:22 +0000 (0:00:05.136) 0:00:06.165 **** 2025-09-27 00:20:23.552145 | orchestrator | skipping: [testbed-manager] 2025-09-27 00:20:23.552164 | orchestrator | 2025-09-27 00:20:23.552181 | orchestrator | TASK [osism.commons.sshconfig : Assemble ssh config] *************************** 2025-09-27 00:20:23.552199 | orchestrator | 
Saturday 27 September 2025 00:20:22 +0000 (0:00:00.074) 0:00:06.239 **** 2025-09-27 00:20:23.552217 | orchestrator | changed: [testbed-manager] 2025-09-27 00:20:23.552288 | orchestrator | 2025-09-27 00:20:23.552307 | orchestrator | PLAY RECAP ********************************************************************* 2025-09-27 00:20:23.552328 | orchestrator | testbed-manager : ok=4  changed=3  unreachable=0 failed=0 skipped=1  rescued=0 ignored=0 2025-09-27 00:20:23.552348 | orchestrator | 2025-09-27 00:20:23.552367 | orchestrator | 2025-09-27 00:20:23.552387 | orchestrator | TASKS RECAP ******************************************************************** 2025-09-27 00:20:23.552406 | orchestrator | Saturday 27 September 2025 00:20:23 +0000 (0:00:00.574) 0:00:06.814 **** 2025-09-27 00:20:23.552425 | orchestrator | =============================================================================== 2025-09-27 00:20:23.552443 | orchestrator | osism.commons.sshconfig : Ensure config for each host exist ------------- 5.14s 2025-09-27 00:20:23.552463 | orchestrator | osism.commons.sshconfig : Assemble ssh config --------------------------- 0.57s 2025-09-27 00:20:23.552483 | orchestrator | osism.commons.sshconfig : Get home directory of operator user ----------- 0.47s 2025-09-27 00:20:23.552502 | orchestrator | osism.commons.sshconfig : Ensure .ssh/config.d exist -------------------- 0.44s 2025-09-27 00:20:23.552522 | orchestrator | osism.commons.sshconfig : Add extra config ------------------------------ 0.07s 2025-09-27 00:20:23.807611 | orchestrator | + osism apply known-hosts 2025-09-27 00:20:35.720844 | orchestrator | 2025-09-27 00:20:35 | INFO  | Task 36e5b80d-f8e0-4f4e-bd96-ce0090aae365 (known-hosts) was prepared for execution. 2025-09-27 00:20:35.720954 | orchestrator | 2025-09-27 00:20:35 | INFO  | It takes a moment until task 36e5b80d-f8e0-4f4e-bd96-ce0090aae365 (known-hosts) has been started and output is visible here. 
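The known-hosts task that follows populates the operator's SSH known_hosts from live key scans of every testbed host. As a rough illustration only (this is not the osism.commons.known_hosts role itself; the host list and file locations are assumed from this testbed's inventory), the same effect can be approximated with ssh-keyscan:

  #!/usr/bin/env bash
  # Hedged sketch: scan RSA/ECDSA/Ed25519 host keys for the testbed hosts and
  # append them to the operator's known_hosts, roughly what the role automates.
  set -euo pipefail
  hosts="testbed-manager testbed-node-0 testbed-node-1 testbed-node-2 testbed-node-3 testbed-node-4 testbed-node-5"
  mkdir -p ~/.ssh && chmod 700 ~/.ssh
  for h in $hosts; do
      # -T caps the per-host wait; the key types match those written in the log below
      ssh-keyscan -T 5 -t rsa,ecdsa,ed25519 "$h" >> ~/.ssh/known_hosts 2>/dev/null
  done
  sort -u -o ~/.ssh/known_hosts ~/.ssh/known_hosts   # de-duplicate repeated scans
  chmod 644 ~/.ssh/known_hosts

The role additionally scans each host a second time by its ansible_host address (the 192.168.16.x entries further down), so both name- and IP-based connections are trusted.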
2025-09-27 00:20:51.953275 | orchestrator | 2025-09-27 00:20:51.953394 | orchestrator | PLAY [Apply role known_hosts] ************************************************** 2025-09-27 00:20:51.953412 | orchestrator | 2025-09-27 00:20:51.953424 | orchestrator | TASK [osism.commons.known_hosts : Run ssh-keyscan for all hosts with hostname] *** 2025-09-27 00:20:51.953436 | orchestrator | Saturday 27 September 2025 00:20:39 +0000 (0:00:00.160) 0:00:00.160 **** 2025-09-27 00:20:51.953448 | orchestrator | ok: [testbed-manager] => (item=testbed-manager) 2025-09-27 00:20:51.953460 | orchestrator | ok: [testbed-manager] => (item=testbed-node-3) 2025-09-27 00:20:51.953471 | orchestrator | ok: [testbed-manager] => (item=testbed-node-4) 2025-09-27 00:20:51.953482 | orchestrator | ok: [testbed-manager] => (item=testbed-node-5) 2025-09-27 00:20:51.953493 | orchestrator | ok: [testbed-manager] => (item=testbed-node-0) 2025-09-27 00:20:51.953504 | orchestrator | ok: [testbed-manager] => (item=testbed-node-1) 2025-09-27 00:20:51.953515 | orchestrator | ok: [testbed-manager] => (item=testbed-node-2) 2025-09-27 00:20:51.953526 | orchestrator | 2025-09-27 00:20:51.953537 | orchestrator | TASK [osism.commons.known_hosts : Write scanned known_hosts entries for all hosts with hostname] *** 2025-09-27 00:20:51.953572 | orchestrator | Saturday 27 September 2025 00:20:45 +0000 (0:00:05.868) 0:00:06.028 **** 2025-09-27 00:20:51.953596 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/known_hosts/tasks/write-scanned.yml for testbed-manager => (item=Scanned entries of testbed-manager) 2025-09-27 00:20:51.953610 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/known_hosts/tasks/write-scanned.yml for testbed-manager => (item=Scanned entries of testbed-node-3) 2025-09-27 00:20:51.953621 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/known_hosts/tasks/write-scanned.yml for testbed-manager => (item=Scanned entries of testbed-node-4) 2025-09-27 00:20:51.953632 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/known_hosts/tasks/write-scanned.yml for testbed-manager => (item=Scanned entries of testbed-node-5) 2025-09-27 00:20:51.953643 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/known_hosts/tasks/write-scanned.yml for testbed-manager => (item=Scanned entries of testbed-node-0) 2025-09-27 00:20:51.953654 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/known_hosts/tasks/write-scanned.yml for testbed-manager => (item=Scanned entries of testbed-node-1) 2025-09-27 00:20:51.953665 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/known_hosts/tasks/write-scanned.yml for testbed-manager => (item=Scanned entries of testbed-node-2) 2025-09-27 00:20:51.953676 | orchestrator | 2025-09-27 00:20:51.953688 | orchestrator | TASK [osism.commons.known_hosts : Write scanned known_hosts entries] *********** 2025-09-27 00:20:51.953699 | orchestrator | Saturday 27 September 2025 00:20:45 +0000 (0:00:00.182) 0:00:06.211 **** 2025-09-27 00:20:51.953710 | orchestrator | changed: [testbed-manager] => (item=testbed-manager ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBAyg+Sy5xg3Q5m/MA2C9M2NqdxeDaePZyDLf/6+/xU7PlhGFPo6a+XULwIjxPcYI+DtD2WW/7BbZNq0wqKtFrCY=) 2025-09-27 00:20:51.953726 | 
orchestrator | changed: [testbed-manager] => (item=testbed-manager ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC9xwum8eDGd08I0j1wtSMOaMrEpoYu4XZqepdN1oBeZv+U/JLhJf0lsNMtykqt6UJBrrBxv8f0RFnRsW3wJg2PipskKRiS0Ae5Oh5qiAGzu9Yc8rC6VCOReOu3En7+nntrdU63TCUGMopn6ATOobZStUrCz+bYFIV3VYqdkgQXAj/GCiVYelnXPFXfuG+fH9ewe0Mox/EjRNIMD46WTdPLaDs5ygqW9pcSvghgY6w5b4ygdTxSob0m02uvWqH+MUD5TNmM9U5hpY/c3EyRLjZWevh40n50NFK0bDYffIqllJgMBJZq+B+kuSdttFNTZp8DBvmJ5nRlMuI21k+lGf1vhd9horf8aMsHUwdJthQAteZXxP0TEG170BWoFKa6mLxjf062UaLlHJhlgtj9lZIQUJ6Jl+Qtrg4wbtjDBLJneBLYCzOwPIRM8KBNrJU5XcDoK395ZRgFScqLmB9HxwaokEU13Hx6kHimyXAHX+iXasKl4PNCgiX8M/YAMMnUfSs=) 2025-09-27 00:20:51.953753 | orchestrator | changed: [testbed-manager] => (item=testbed-manager ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIHeyYV/ZrHeyPKqiJTwP8IViVsXEmTZGHm5iJMcohLFw) 2025-09-27 00:20:51.953766 | orchestrator | 2025-09-27 00:20:51.953777 | orchestrator | TASK [osism.commons.known_hosts : Write scanned known_hosts entries] *********** 2025-09-27 00:20:51.953788 | orchestrator | Saturday 27 September 2025 00:20:46 +0000 (0:00:01.159) 0:00:07.370 **** 2025-09-27 00:20:51.953799 | orchestrator | changed: [testbed-manager] => (item=testbed-node-3 ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBK2aJj+Xuv29WUXOikSwwFiEUnTKyVwruoIs7wFvXISUlET7GsK7YD8d1yBFn2vdDPY7GlxsqFCKwReh7NbGisY=) 2025-09-27 00:20:51.953810 | orchestrator | changed: [testbed-manager] => (item=testbed-node-3 ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIOKmR5nndWQ8wQ6D8Cs3LBdaRUs1P30FwXhPu6nxyMWk) 2025-09-27 00:20:51.953851 | orchestrator | changed: [testbed-manager] => (item=testbed-node-3 ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC/z6XxvoWbCU3Ub1D6j9Th72n4L79HV4l0tfVLUr4LlVa7dApEGJt9hz/lCwcjvqpadp9wmW0hTQFVh5Dg2IOIJ1b36mnLCoKeOjS+719wPyIMraWnUgANdwRf6DCl3rBpxUGBbYJjdTvR49DHN4tP6WGxv65jAdbFvYYuH3QTjbJ/HPDogAvakVR7ArSejjsHEFkf5hZ8/xPfMHljGMWK9RRBYWlDVxYE+ekOdIA5cjPJ1WLkrzkLSsNNQ3MgV4TQTwOa9nO1egZAbM5UMLUTUTjgHXGf37r89qj0zX/eeb+OK4GUBtq1eyy6Y95vQMbxl9dEOxeVNngGPAhCSQoxFNl3pPesa2d8zW4eKDuMrjHupEROmK6B7t0N7Ws+Kk9ouFjzhyvvkF1DJIif8TAffz1Csgqa9lsbTnGYE2p2VN/x0JF62/eXTyDmu4dxiWf0mOjWF3MLFyalb8O5+QwmWAscUInRxbE3khE6jNcPB6ojNekRItSAgH+s4unwmo0=) 2025-09-27 00:20:51.953874 | orchestrator | 2025-09-27 00:20:51.953885 | orchestrator | TASK [osism.commons.known_hosts : Write scanned known_hosts entries] *********** 2025-09-27 00:20:51.953896 | orchestrator | Saturday 27 September 2025 00:20:47 +0000 (0:00:01.027) 0:00:08.398 **** 2025-09-27 00:20:51.953907 | orchestrator | changed: [testbed-manager] => (item=testbed-node-4 ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAINGhE9uA1eYv0mFxI7CgXTinCkMwUvGBK/4XPlapywua) 2025-09-27 00:20:51.953919 | orchestrator | changed: [testbed-manager] => (item=testbed-node-4 ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDA7wakZ1Ufx6HYaGI8FNsKTa7mstQo9Nu4VoreHLzgw8m9BrLtmxaB+HjeUVSNcespK/2v8C3qbzOLIHTtyBpXSaC3LRCmyxcu/HEQS18EtyyF/p4SI0WxUHeDJ3psdYidU1WFF3Cqbv8manFdIi5d+Sw3bHITxGi+9caBwWWdGpzSr1fROOLvuyis4wnbaNaxbkSwNSFR7RAfZ7VWvf2ES1eWKDh4Z2VoH80ISUVsJ1vJx6fR3fq9A7zvnppaJLg8q7rvKT3OMpwI44/LSvju6fFk0r5RkKp7prtZzGsuDrggM7Am/+gcnNGY4RL3ngyiMgUZC7Hy50kPpmyZ8Qi+u2i5ZU1knbeKBJIBYCQu7ZZGX1yae2qFYQMQVjJQvLzz1IsAf+kMWRMVMfIZqpmww/J73XDXnD4+xUzroGBxyBxXTjHgrc+31Wj4Kf99ppFknLrrAUE0Jbew2/ZBkRz0GDOmAqCPkXWWKS0u8CMI1q1NAX7FL5Hywgy3RwryrIk=) 2025-09-27 00:20:51.954001 | orchestrator | changed: [testbed-manager] => (item=testbed-node-4 ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPY9cA3jLoyiHPmzk9fNrYdEdVbQzuuCLYIE0mADIXuIhqgaXOCRh/ZLAAtJ0Yl3PUtDTPEkBx6w6JMErXgKIX0=) 2025-09-27 
00:20:51.954085 | orchestrator | 2025-09-27 00:20:51.954102 | orchestrator | TASK [osism.commons.known_hosts : Write scanned known_hosts entries] *********** 2025-09-27 00:20:51.954113 | orchestrator | Saturday 27 September 2025 00:20:48 +0000 (0:00:01.064) 0:00:09.462 **** 2025-09-27 00:20:51.954124 | orchestrator | changed: [testbed-manager] => (item=testbed-node-5 ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAICvZ15kWDwr1NRszMrVqIMmZKPQfyxrJZLzFUuByGtDM) 2025-09-27 00:20:51.954135 | orchestrator | changed: [testbed-manager] => (item=testbed-node-5 ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCuIKbPf5woFeg5uanfw9YwVdHSi+rGJCLiU+Wcu3rW+PIDRK+rByxL7F/AI1N+W6P21iyz3YBSviaAXrpK9YgOyPPfAWThU9iLlrkSXi59Le3FkHb9zcZJep/ySf/6+Hx9zUPwjIH0f48gT79iIww//nE0Yjm04klVYWmcu8tR/CRMUcPRFkhwZwut/4TRj3CMEiLwrDLbAlUGpe2ufuJe74esIo5bLvlUypOmwpuxwTJP4kgehAnftaZOmx2d+KRTt97LeAjVN7NGuQf+X8nB2JXCWZ5CHQr2JJL1ySvYcRRbZ7a1zRuEcvDz7XW2TH06gHHgnlQGEggolRdXP3ZgPs9BOigKoFIji31w4S2ZORQBzJ7DHqMmuK+88O19A79akaWLD2LlbfAGXH1TgVv6uXDL/jcrbRfeQ0gzOwLRuLBqkalAJY55DTivpje/NdTQFjwpFmqfOW2UmxUf8T4rkvUtny8Xow+s38YI5xt3Wm/Ga4Az/2sgpaOY2QpKXi8=) 2025-09-27 00:20:51.954147 | orchestrator | changed: [testbed-manager] => (item=testbed-node-5 ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBJsTs/eGKH2Naj0Zc9/6O9377D36mql7wK3judbldpT10m0oR++IlYQi5lXunzwfdJuy8ra55tJiEUY9lfOAKCM=) 2025-09-27 00:20:51.954158 | orchestrator | 2025-09-27 00:20:51.954169 | orchestrator | TASK [osism.commons.known_hosts : Write scanned known_hosts entries] *********** 2025-09-27 00:20:51.954180 | orchestrator | Saturday 27 September 2025 00:20:49 +0000 (0:00:01.045) 0:00:10.507 **** 2025-09-27 00:20:51.954191 | orchestrator | changed: [testbed-manager] => (item=testbed-node-0 ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIH1i/gUZcRaeuaMATQYgDw6ebO+dwxNKalbE/+CKnnC+) 2025-09-27 00:20:51.954202 | orchestrator | changed: [testbed-manager] => (item=testbed-node-0 ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDPzXR2bCAs2Rn2tZ4ePdjB5MdQlvXRPpt+b46KIhsMnE8TZc32pih2XpE3kqanzB/kauRemrizA85FB0M/fYNpMN8VtoXhUp+Ps0/Ivmk4C4lIFCf31cTPyRhzf1xbfVie2iHYQYulhWoJKLz4F6W8je87Ivv4E1hqLOgoDtQ19+ghdjAqPRlbjo6YbywlPVV/RRhnJ87OEEp3u2lYj+1SGA5vaGAr72QBCFoQ/ePSj1itgc9+8bvDiPNdyvjGN1N3kBWHgx7NDDUaEDkQMqukf6x4+11AH3CxvHxL9Ay4bXhIFz6GZHCru0iROL28qAV5Tt56UReFvanXVcQFmmvDjiAdtNspFLfpe0qmOurpjmOuRE8dupys6s6eiX8sksqFxWYWN2ym3z2ygtQw02r9c3aBT+LngiO8iw9dCQ33sru0ECT/lgESmMySWkcwqNCRMnF3/j98ahcL+vvuumnW6WNoZkILAmKw2LoMLDJtaQ2fKUGvZBQPKBjrYSE+VME=) 2025-09-27 00:20:51.954232 | orchestrator | changed: [testbed-manager] => (item=testbed-node-0 ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBGqNhYhvp7eCR5nbz9NQqsnoxphrOh5BfFuQ6DpzwPh+RBel13IR9N1xtw8NMZuasNpyP0JbsS4nlCdyYLXZL48=) 2025-09-27 00:20:51.954253 | orchestrator | 2025-09-27 00:20:51.954265 | orchestrator | TASK [osism.commons.known_hosts : Write scanned known_hosts entries] *********** 2025-09-27 00:20:51.954276 | orchestrator | Saturday 27 September 2025 00:20:50 +0000 (0:00:01.042) 0:00:11.549 **** 2025-09-27 00:20:51.954295 | orchestrator | changed: [testbed-manager] => (item=testbed-node-1 ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIKSdz0uGmgsPkmmMWcV6bFV9jbVFB9x65mKTRpJYEeMH) 2025-09-27 00:21:02.896634 | orchestrator | changed: [testbed-manager] => (item=testbed-node-1 ssh-rsa 
AAAAB3NzaC1yc2EAAAADAQABAAABgQCLumMIHKpkfJbPw084wLzTXJUduprz8yRgTPYG4ocC0+oy17V/1BHiVQDEtUIWLV7TS6KZUqrtemERKQze7fb/ocr+GwOW9zah6hw1VzcZtJZDbC5jNLh9+Qws9eJrO0ZH6JNppCO3hkJ7QaxxyQU4XisX9m5M76w+JWZY72oQQTxgs7RBmODxBtL58X5r6NxXr7wgzwbRk9KCi4MkI+1Zgy2VOctnHHnx8H9qSWEDA6fV9G6h8SeAPSxnTdSpnEiWnsshA/8Q4FuBds1EuNwhzgGIlj/V/iuOUqVfIUAWEd4ChKAYYjcr9hFLzzv4/8Q/0dU/+0FYWRyjOgkykLnnWvoMonxmd7w354dxUqsXH+rspK06DapAvpTD9XoK/OMLWmR0m9DyS64sCQ2o8orNdE7UtEdHs+EcJyH8D8L90aTNVv40WAUGXDJ5L/66mWusDrlYooaUWZI0sHHSdag346RoBb5UNDcmRUg/mZRCzgd5EacJf9Gqf8v3EbxiAqE=) 2025-09-27 00:21:02.896757 | orchestrator | changed: [testbed-manager] => (item=testbed-node-1 ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBGVul4WqOq1RVxQha70CRcWi+UWVUrIwb26b+40HUj5tD9LrTjSza/EI9UomwwhcsnOqMoaXxGkTOe5LB4uvaXg=) 2025-09-27 00:21:02.896775 | orchestrator | 2025-09-27 00:21:02.896788 | orchestrator | TASK [osism.commons.known_hosts : Write scanned known_hosts entries] *********** 2025-09-27 00:21:02.896801 | orchestrator | Saturday 27 September 2025 00:20:51 +0000 (0:00:01.001) 0:00:12.551 **** 2025-09-27 00:21:02.896812 | orchestrator | changed: [testbed-manager] => (item=testbed-node-2 ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBAuA18UYhdq2NSEYND3BSDs7VpzxFaeuUC5Vde6mtXIAoRItbzrUl3/FkqP1i23QS4nfwqua1NOnz1+JS+pLUfg=) 2025-09-27 00:21:02.896825 | orchestrator | changed: [testbed-manager] => (item=testbed-node-2 ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCb2AVp9bbTXiH+Cx/ocHUpkq/cBrj0iKG0Jgzd6GDY62laFhzBm0iQKPoYGSfIp0/Pvh/ZXuYQxuDhC2FtbzdOPRZeHb6CeRAht9dMop3FfDlWZpnX8+KZjDq8pWIFQE2sP++lIjUR1TS1NcyWXIt/1vCDAbz/FY7eurxIvbOlqZxwfp9kmweAUSVIZ5HkD8X0qhcp5xrLmv7NcCkbUOUQKgHQNanC7fIzOSlFKttJLnsYdW+RXhsZEhLXIj1YeYPRmp+w+gnt10Zjc/Pmhex/IArvo00tMDRf7FSFdXdGbPxf/K1ozt/Ljsfei1DZt6ZAyhEuLa/rGsrl4FjTZnI02MEq6SEUoKsRuMow4X6qmc6WuPcNpTSZ9n6yp+lEe3uWf92SAAHrdAHGuO93Got4Y4jK5zto6xiAsmpXHpto/Jap7qJcSKzuQquMChcgEGne7lRNoNRzXEU3/oxoR7y8VfDLMFwXnnKMTgCuXNnA0wClEA5iKNaWQX0emOrfJKU=) 2025-09-27 00:21:02.896837 | orchestrator | changed: [testbed-manager] => (item=testbed-node-2 ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIKzkDR3mp70Slg+gUcYoYp/Jm04eC5KnrHcjcocYiziO) 2025-09-27 00:21:02.896850 | orchestrator | 2025-09-27 00:21:02.896862 | orchestrator | TASK [osism.commons.known_hosts : Run ssh-keyscan for all hosts with ansible_host] *** 2025-09-27 00:21:02.896874 | orchestrator | Saturday 27 September 2025 00:20:52 +0000 (0:00:01.049) 0:00:13.600 **** 2025-09-27 00:21:02.896885 | orchestrator | ok: [testbed-manager] => (item=testbed-manager) 2025-09-27 00:21:02.896897 | orchestrator | ok: [testbed-manager] => (item=testbed-node-3) 2025-09-27 00:21:02.896908 | orchestrator | ok: [testbed-manager] => (item=testbed-node-4) 2025-09-27 00:21:02.896919 | orchestrator | ok: [testbed-manager] => (item=testbed-node-5) 2025-09-27 00:21:02.896930 | orchestrator | ok: [testbed-manager] => (item=testbed-node-0) 2025-09-27 00:21:02.896961 | orchestrator | ok: [testbed-manager] => (item=testbed-node-1) 2025-09-27 00:21:02.896973 | orchestrator | ok: [testbed-manager] => (item=testbed-node-2) 2025-09-27 00:21:02.896984 | orchestrator | 2025-09-27 00:21:02.896996 | orchestrator | TASK [osism.commons.known_hosts : Write scanned known_hosts entries for all hosts with ansible_host] *** 2025-09-27 00:21:02.897008 | orchestrator | Saturday 27 September 2025 00:20:58 +0000 (0:00:05.283) 0:00:18.884 **** 2025-09-27 00:21:02.897020 | orchestrator | included: 
/usr/share/ansible/collections/ansible_collections/osism/commons/roles/known_hosts/tasks/write-scanned.yml for testbed-manager => (item=Scanned entries of testbed-manager) 2025-09-27 00:21:02.897054 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/known_hosts/tasks/write-scanned.yml for testbed-manager => (item=Scanned entries of testbed-node-3) 2025-09-27 00:21:02.897066 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/known_hosts/tasks/write-scanned.yml for testbed-manager => (item=Scanned entries of testbed-node-4) 2025-09-27 00:21:02.897077 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/known_hosts/tasks/write-scanned.yml for testbed-manager => (item=Scanned entries of testbed-node-5) 2025-09-27 00:21:02.897088 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/known_hosts/tasks/write-scanned.yml for testbed-manager => (item=Scanned entries of testbed-node-0) 2025-09-27 00:21:02.897099 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/known_hosts/tasks/write-scanned.yml for testbed-manager => (item=Scanned entries of testbed-node-1) 2025-09-27 00:21:02.897110 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/known_hosts/tasks/write-scanned.yml for testbed-manager => (item=Scanned entries of testbed-node-2) 2025-09-27 00:21:02.897121 | orchestrator | 2025-09-27 00:21:02.897148 | orchestrator | TASK [osism.commons.known_hosts : Write scanned known_hosts entries] *********** 2025-09-27 00:21:02.897162 | orchestrator | Saturday 27 September 2025 00:20:58 +0000 (0:00:00.171) 0:00:19.055 **** 2025-09-27 00:21:02.897176 | orchestrator | changed: [testbed-manager] => (item=192.168.16.5 ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIHeyYV/ZrHeyPKqiJTwP8IViVsXEmTZGHm5iJMcohLFw) 2025-09-27 00:21:02.897192 | orchestrator | changed: [testbed-manager] => (item=192.168.16.5 ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC9xwum8eDGd08I0j1wtSMOaMrEpoYu4XZqepdN1oBeZv+U/JLhJf0lsNMtykqt6UJBrrBxv8f0RFnRsW3wJg2PipskKRiS0Ae5Oh5qiAGzu9Yc8rC6VCOReOu3En7+nntrdU63TCUGMopn6ATOobZStUrCz+bYFIV3VYqdkgQXAj/GCiVYelnXPFXfuG+fH9ewe0Mox/EjRNIMD46WTdPLaDs5ygqW9pcSvghgY6w5b4ygdTxSob0m02uvWqH+MUD5TNmM9U5hpY/c3EyRLjZWevh40n50NFK0bDYffIqllJgMBJZq+B+kuSdttFNTZp8DBvmJ5nRlMuI21k+lGf1vhd9horf8aMsHUwdJthQAteZXxP0TEG170BWoFKa6mLxjf062UaLlHJhlgtj9lZIQUJ6Jl+Qtrg4wbtjDBLJneBLYCzOwPIRM8KBNrJU5XcDoK395ZRgFScqLmB9HxwaokEU13Hx6kHimyXAHX+iXasKl4PNCgiX8M/YAMMnUfSs=) 2025-09-27 00:21:02.897236 | orchestrator | changed: [testbed-manager] => (item=192.168.16.5 ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBAyg+Sy5xg3Q5m/MA2C9M2NqdxeDaePZyDLf/6+/xU7PlhGFPo6a+XULwIjxPcYI+DtD2WW/7BbZNq0wqKtFrCY=) 2025-09-27 00:21:02.897251 | orchestrator | 2025-09-27 00:21:02.897264 | orchestrator | TASK [osism.commons.known_hosts : Write scanned known_hosts entries] *********** 2025-09-27 00:21:02.897277 | orchestrator | Saturday 27 September 2025 00:20:59 +0000 (0:00:01.101) 0:00:20.157 **** 2025-09-27 00:21:02.897290 | orchestrator | changed: [testbed-manager] => (item=192.168.16.13 ssh-rsa 
AAAAB3NzaC1yc2EAAAADAQABAAABgQC/z6XxvoWbCU3Ub1D6j9Th72n4L79HV4l0tfVLUr4LlVa7dApEGJt9hz/lCwcjvqpadp9wmW0hTQFVh5Dg2IOIJ1b36mnLCoKeOjS+719wPyIMraWnUgANdwRf6DCl3rBpxUGBbYJjdTvR49DHN4tP6WGxv65jAdbFvYYuH3QTjbJ/HPDogAvakVR7ArSejjsHEFkf5hZ8/xPfMHljGMWK9RRBYWlDVxYE+ekOdIA5cjPJ1WLkrzkLSsNNQ3MgV4TQTwOa9nO1egZAbM5UMLUTUTjgHXGf37r89qj0zX/eeb+OK4GUBtq1eyy6Y95vQMbxl9dEOxeVNngGPAhCSQoxFNl3pPesa2d8zW4eKDuMrjHupEROmK6B7t0N7Ws+Kk9ouFjzhyvvkF1DJIif8TAffz1Csgqa9lsbTnGYE2p2VN/x0JF62/eXTyDmu4dxiWf0mOjWF3MLFyalb8O5+QwmWAscUInRxbE3khE6jNcPB6ojNekRItSAgH+s4unwmo0=) 2025-09-27 00:21:02.897304 | orchestrator | changed: [testbed-manager] => (item=192.168.16.13 ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBK2aJj+Xuv29WUXOikSwwFiEUnTKyVwruoIs7wFvXISUlET7GsK7YD8d1yBFn2vdDPY7GlxsqFCKwReh7NbGisY=) 2025-09-27 00:21:02.897317 | orchestrator | changed: [testbed-manager] => (item=192.168.16.13 ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIOKmR5nndWQ8wQ6D8Cs3LBdaRUs1P30FwXhPu6nxyMWk) 2025-09-27 00:21:02.897329 | orchestrator | 2025-09-27 00:21:02.897342 | orchestrator | TASK [osism.commons.known_hosts : Write scanned known_hosts entries] *********** 2025-09-27 00:21:02.897363 | orchestrator | Saturday 27 September 2025 00:21:00 +0000 (0:00:01.100) 0:00:21.258 **** 2025-09-27 00:21:02.897377 | orchestrator | changed: [testbed-manager] => (item=192.168.16.14 ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAINGhE9uA1eYv0mFxI7CgXTinCkMwUvGBK/4XPlapywua) 2025-09-27 00:21:02.897391 | orchestrator | changed: [testbed-manager] => (item=192.168.16.14 ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDA7wakZ1Ufx6HYaGI8FNsKTa7mstQo9Nu4VoreHLzgw8m9BrLtmxaB+HjeUVSNcespK/2v8C3qbzOLIHTtyBpXSaC3LRCmyxcu/HEQS18EtyyF/p4SI0WxUHeDJ3psdYidU1WFF3Cqbv8manFdIi5d+Sw3bHITxGi+9caBwWWdGpzSr1fROOLvuyis4wnbaNaxbkSwNSFR7RAfZ7VWvf2ES1eWKDh4Z2VoH80ISUVsJ1vJx6fR3fq9A7zvnppaJLg8q7rvKT3OMpwI44/LSvju6fFk0r5RkKp7prtZzGsuDrggM7Am/+gcnNGY4RL3ngyiMgUZC7Hy50kPpmyZ8Qi+u2i5ZU1knbeKBJIBYCQu7ZZGX1yae2qFYQMQVjJQvLzz1IsAf+kMWRMVMfIZqpmww/J73XDXnD4+xUzroGBxyBxXTjHgrc+31Wj4Kf99ppFknLrrAUE0Jbew2/ZBkRz0GDOmAqCPkXWWKS0u8CMI1q1NAX7FL5Hywgy3RwryrIk=) 2025-09-27 00:21:02.897404 | orchestrator | changed: [testbed-manager] => (item=192.168.16.14 ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPY9cA3jLoyiHPmzk9fNrYdEdVbQzuuCLYIE0mADIXuIhqgaXOCRh/ZLAAtJ0Yl3PUtDTPEkBx6w6JMErXgKIX0=) 2025-09-27 00:21:02.897417 | orchestrator | 2025-09-27 00:21:02.897430 | orchestrator | TASK [osism.commons.known_hosts : Write scanned known_hosts entries] *********** 2025-09-27 00:21:02.897443 | orchestrator | Saturday 27 September 2025 00:21:01 +0000 (0:00:01.099) 0:00:22.358 **** 2025-09-27 00:21:02.897456 | orchestrator | changed: [testbed-manager] => (item=192.168.16.15 ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBJsTs/eGKH2Naj0Zc9/6O9377D36mql7wK3judbldpT10m0oR++IlYQi5lXunzwfdJuy8ra55tJiEUY9lfOAKCM=) 2025-09-27 00:21:02.897499 | orchestrator | changed: [testbed-manager] => (item=192.168.16.15 ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCuIKbPf5woFeg5uanfw9YwVdHSi+rGJCLiU+Wcu3rW+PIDRK+rByxL7F/AI1N+W6P21iyz3YBSviaAXrpK9YgOyPPfAWThU9iLlrkSXi59Le3FkHb9zcZJep/ySf/6+Hx9zUPwjIH0f48gT79iIww//nE0Yjm04klVYWmcu8tR/CRMUcPRFkhwZwut/4TRj3CMEiLwrDLbAlUGpe2ufuJe74esIo5bLvlUypOmwpuxwTJP4kgehAnftaZOmx2d+KRTt97LeAjVN7NGuQf+X8nB2JXCWZ5CHQr2JJL1ySvYcRRbZ7a1zRuEcvDz7XW2TH06gHHgnlQGEggolRdXP3ZgPs9BOigKoFIji31w4S2ZORQBzJ7DHqMmuK+88O19A79akaWLD2LlbfAGXH1TgVv6uXDL/jcrbRfeQ0gzOwLRuLBqkalAJY55DTivpje/NdTQFjwpFmqfOW2UmxUf8T4rkvUtny8Xow+s38YI5xt3Wm/Ga4Az/2sgpaOY2QpKXi8=) 2025-09-27 
00:21:07.224860 | orchestrator | changed: [testbed-manager] => (item=192.168.16.15 ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAICvZ15kWDwr1NRszMrVqIMmZKPQfyxrJZLzFUuByGtDM) 2025-09-27 00:21:07.224964 | orchestrator | 2025-09-27 00:21:07.224981 | orchestrator | TASK [osism.commons.known_hosts : Write scanned known_hosts entries] *********** 2025-09-27 00:21:07.224995 | orchestrator | Saturday 27 September 2025 00:21:02 +0000 (0:00:01.135) 0:00:23.493 **** 2025-09-27 00:21:07.225008 | orchestrator | changed: [testbed-manager] => (item=192.168.16.10 ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDPzXR2bCAs2Rn2tZ4ePdjB5MdQlvXRPpt+b46KIhsMnE8TZc32pih2XpE3kqanzB/kauRemrizA85FB0M/fYNpMN8VtoXhUp+Ps0/Ivmk4C4lIFCf31cTPyRhzf1xbfVie2iHYQYulhWoJKLz4F6W8je87Ivv4E1hqLOgoDtQ19+ghdjAqPRlbjo6YbywlPVV/RRhnJ87OEEp3u2lYj+1SGA5vaGAr72QBCFoQ/ePSj1itgc9+8bvDiPNdyvjGN1N3kBWHgx7NDDUaEDkQMqukf6x4+11AH3CxvHxL9Ay4bXhIFz6GZHCru0iROL28qAV5Tt56UReFvanXVcQFmmvDjiAdtNspFLfpe0qmOurpjmOuRE8dupys6s6eiX8sksqFxWYWN2ym3z2ygtQw02r9c3aBT+LngiO8iw9dCQ33sru0ECT/lgESmMySWkcwqNCRMnF3/j98ahcL+vvuumnW6WNoZkILAmKw2LoMLDJtaQ2fKUGvZBQPKBjrYSE+VME=) 2025-09-27 00:21:07.225023 | orchestrator | changed: [testbed-manager] => (item=192.168.16.10 ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIH1i/gUZcRaeuaMATQYgDw6ebO+dwxNKalbE/+CKnnC+) 2025-09-27 00:21:07.225035 | orchestrator | changed: [testbed-manager] => (item=192.168.16.10 ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBGqNhYhvp7eCR5nbz9NQqsnoxphrOh5BfFuQ6DpzwPh+RBel13IR9N1xtw8NMZuasNpyP0JbsS4nlCdyYLXZL48=) 2025-09-27 00:21:07.225047 | orchestrator | 2025-09-27 00:21:07.225058 | orchestrator | TASK [osism.commons.known_hosts : Write scanned known_hosts entries] *********** 2025-09-27 00:21:07.225069 | orchestrator | Saturday 27 September 2025 00:21:03 +0000 (0:00:01.058) 0:00:24.552 **** 2025-09-27 00:21:07.225108 | orchestrator | changed: [testbed-manager] => (item=192.168.16.11 ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBGVul4WqOq1RVxQha70CRcWi+UWVUrIwb26b+40HUj5tD9LrTjSza/EI9UomwwhcsnOqMoaXxGkTOe5LB4uvaXg=) 2025-09-27 00:21:07.225119 | orchestrator | changed: [testbed-manager] => (item=192.168.16.11 ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIKSdz0uGmgsPkmmMWcV6bFV9jbVFB9x65mKTRpJYEeMH) 2025-09-27 00:21:07.225165 | orchestrator | changed: [testbed-manager] => (item=192.168.16.11 ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCLumMIHKpkfJbPw084wLzTXJUduprz8yRgTPYG4ocC0+oy17V/1BHiVQDEtUIWLV7TS6KZUqrtemERKQze7fb/ocr+GwOW9zah6hw1VzcZtJZDbC5jNLh9+Qws9eJrO0ZH6JNppCO3hkJ7QaxxyQU4XisX9m5M76w+JWZY72oQQTxgs7RBmODxBtL58X5r6NxXr7wgzwbRk9KCi4MkI+1Zgy2VOctnHHnx8H9qSWEDA6fV9G6h8SeAPSxnTdSpnEiWnsshA/8Q4FuBds1EuNwhzgGIlj/V/iuOUqVfIUAWEd4ChKAYYjcr9hFLzzv4/8Q/0dU/+0FYWRyjOgkykLnnWvoMonxmd7w354dxUqsXH+rspK06DapAvpTD9XoK/OMLWmR0m9DyS64sCQ2o8orNdE7UtEdHs+EcJyH8D8L90aTNVv40WAUGXDJ5L/66mWusDrlYooaUWZI0sHHSdag346RoBb5UNDcmRUg/mZRCzgd5EacJf9Gqf8v3EbxiAqE=) 2025-09-27 00:21:07.225178 | orchestrator | 2025-09-27 00:21:07.225189 | orchestrator | TASK [osism.commons.known_hosts : Write scanned known_hosts entries] *********** 2025-09-27 00:21:07.225200 | orchestrator | Saturday 27 September 2025 00:21:04 +0000 (0:00:01.039) 0:00:25.591 **** 2025-09-27 00:21:07.225258 | orchestrator | changed: [testbed-manager] => (item=192.168.16.12 ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBAuA18UYhdq2NSEYND3BSDs7VpzxFaeuUC5Vde6mtXIAoRItbzrUl3/FkqP1i23QS4nfwqua1NOnz1+JS+pLUfg=) 2025-09-27 00:21:07.225271 | orchestrator | changed: [testbed-manager] => (item=192.168.16.12 ssh-rsa 
AAAAB3NzaC1yc2EAAAADAQABAAABgQCb2AVp9bbTXiH+Cx/ocHUpkq/cBrj0iKG0Jgzd6GDY62laFhzBm0iQKPoYGSfIp0/Pvh/ZXuYQxuDhC2FtbzdOPRZeHb6CeRAht9dMop3FfDlWZpnX8+KZjDq8pWIFQE2sP++lIjUR1TS1NcyWXIt/1vCDAbz/FY7eurxIvbOlqZxwfp9kmweAUSVIZ5HkD8X0qhcp5xrLmv7NcCkbUOUQKgHQNanC7fIzOSlFKttJLnsYdW+RXhsZEhLXIj1YeYPRmp+w+gnt10Zjc/Pmhex/IArvo00tMDRf7FSFdXdGbPxf/K1ozt/Ljsfei1DZt6ZAyhEuLa/rGsrl4FjTZnI02MEq6SEUoKsRuMow4X6qmc6WuPcNpTSZ9n6yp+lEe3uWf92SAAHrdAHGuO93Got4Y4jK5zto6xiAsmpXHpto/Jap7qJcSKzuQquMChcgEGne7lRNoNRzXEU3/oxoR7y8VfDLMFwXnnKMTgCuXNnA0wClEA5iKNaWQX0emOrfJKU=) 2025-09-27 00:21:07.225282 | orchestrator | changed: [testbed-manager] => (item=192.168.16.12 ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIKzkDR3mp70Slg+gUcYoYp/Jm04eC5KnrHcjcocYiziO) 2025-09-27 00:21:07.225293 | orchestrator | 2025-09-27 00:21:07.225304 | orchestrator | TASK [osism.commons.known_hosts : Write static known_hosts entries] ************ 2025-09-27 00:21:07.225315 | orchestrator | Saturday 27 September 2025 00:21:06 +0000 (0:00:01.034) 0:00:26.626 **** 2025-09-27 00:21:07.225327 | orchestrator | skipping: [testbed-manager] => (item=testbed-manager)  2025-09-27 00:21:07.225338 | orchestrator | skipping: [testbed-manager] => (item=testbed-node-3)  2025-09-27 00:21:07.225349 | orchestrator | skipping: [testbed-manager] => (item=testbed-node-4)  2025-09-27 00:21:07.225360 | orchestrator | skipping: [testbed-manager] => (item=testbed-node-5)  2025-09-27 00:21:07.225371 | orchestrator | skipping: [testbed-manager] => (item=testbed-node-0)  2025-09-27 00:21:07.225400 | orchestrator | skipping: [testbed-manager] => (item=testbed-node-1)  2025-09-27 00:21:07.225414 | orchestrator | skipping: [testbed-manager] => (item=testbed-node-2)  2025-09-27 00:21:07.225427 | orchestrator | skipping: [testbed-manager] 2025-09-27 00:21:07.225440 | orchestrator | 2025-09-27 00:21:07.225453 | orchestrator | TASK [osism.commons.known_hosts : Write extra known_hosts entries] ************* 2025-09-27 00:21:07.225465 | orchestrator | Saturday 27 September 2025 00:21:06 +0000 (0:00:00.167) 0:00:26.793 **** 2025-09-27 00:21:07.225478 | orchestrator | skipping: [testbed-manager] 2025-09-27 00:21:07.225491 | orchestrator | 2025-09-27 00:21:07.225505 | orchestrator | TASK [osism.commons.known_hosts : Delete known_hosts entries] ****************** 2025-09-27 00:21:07.225518 | orchestrator | Saturday 27 September 2025 00:21:06 +0000 (0:00:00.070) 0:00:26.864 **** 2025-09-27 00:21:07.225530 | orchestrator | skipping: [testbed-manager] 2025-09-27 00:21:07.225543 | orchestrator | 2025-09-27 00:21:07.225567 | orchestrator | TASK [osism.commons.known_hosts : Set file permissions] ************************ 2025-09-27 00:21:07.225580 | orchestrator | Saturday 27 September 2025 00:21:06 +0000 (0:00:00.059) 0:00:26.924 **** 2025-09-27 00:21:07.225593 | orchestrator | changed: [testbed-manager] 2025-09-27 00:21:07.225606 | orchestrator | 2025-09-27 00:21:07.225618 | orchestrator | PLAY RECAP ********************************************************************* 2025-09-27 00:21:07.225632 | orchestrator | testbed-manager : ok=31  changed=15  unreachable=0 failed=0 skipped=3  rescued=0 ignored=0 2025-09-27 00:21:07.225646 | orchestrator | 2025-09-27 00:21:07.225658 | orchestrator | 2025-09-27 00:21:07.225670 | orchestrator | TASKS RECAP ******************************************************************** 2025-09-27 00:21:07.225683 | orchestrator | Saturday 27 September 2025 00:21:06 +0000 (0:00:00.636) 0:00:27.560 **** 2025-09-27 00:21:07.225696 | orchestrator | 
=============================================================================== 2025-09-27 00:21:07.225709 | orchestrator | osism.commons.known_hosts : Run ssh-keyscan for all hosts with hostname --- 5.87s 2025-09-27 00:21:07.225721 | orchestrator | osism.commons.known_hosts : Run ssh-keyscan for all hosts with ansible_host --- 5.28s 2025-09-27 00:21:07.225734 | orchestrator | osism.commons.known_hosts : Write scanned known_hosts entries ----------- 1.16s 2025-09-27 00:21:07.225747 | orchestrator | osism.commons.known_hosts : Write scanned known_hosts entries ----------- 1.14s 2025-09-27 00:21:07.225757 | orchestrator | osism.commons.known_hosts : Write scanned known_hosts entries ----------- 1.10s 2025-09-27 00:21:07.225768 | orchestrator | osism.commons.known_hosts : Write scanned known_hosts entries ----------- 1.10s 2025-09-27 00:21:07.225779 | orchestrator | osism.commons.known_hosts : Write scanned known_hosts entries ----------- 1.10s 2025-09-27 00:21:07.225790 | orchestrator | osism.commons.known_hosts : Write scanned known_hosts entries ----------- 1.06s 2025-09-27 00:21:07.225801 | orchestrator | osism.commons.known_hosts : Write scanned known_hosts entries ----------- 1.06s 2025-09-27 00:21:07.225812 | orchestrator | osism.commons.known_hosts : Write scanned known_hosts entries ----------- 1.05s 2025-09-27 00:21:07.225822 | orchestrator | osism.commons.known_hosts : Write scanned known_hosts entries ----------- 1.05s 2025-09-27 00:21:07.225833 | orchestrator | osism.commons.known_hosts : Write scanned known_hosts entries ----------- 1.04s 2025-09-27 00:21:07.225844 | orchestrator | osism.commons.known_hosts : Write scanned known_hosts entries ----------- 1.04s 2025-09-27 00:21:07.225863 | orchestrator | osism.commons.known_hosts : Write scanned known_hosts entries ----------- 1.04s 2025-09-27 00:21:07.225874 | orchestrator | osism.commons.known_hosts : Write scanned known_hosts entries ----------- 1.03s 2025-09-27 00:21:07.225885 | orchestrator | osism.commons.known_hosts : Write scanned known_hosts entries ----------- 1.00s 2025-09-27 00:21:07.225897 | orchestrator | osism.commons.known_hosts : Set file permissions ------------------------ 0.64s 2025-09-27 00:21:07.225907 | orchestrator | osism.commons.known_hosts : Write scanned known_hosts entries for all hosts with hostname --- 0.18s 2025-09-27 00:21:07.225919 | orchestrator | osism.commons.known_hosts : Write scanned known_hosts entries for all hosts with ansible_host --- 0.17s 2025-09-27 00:21:07.225931 | orchestrator | osism.commons.known_hosts : Write static known_hosts entries ------------ 0.17s 2025-09-27 00:21:07.496517 | orchestrator | + osism apply squid 2025-09-27 00:21:19.535448 | orchestrator | 2025-09-27 00:21:19 | INFO  | Task 3e53147d-a8f0-49d9-9b75-80789582ddcb (squid) was prepared for execution. 2025-09-27 00:21:19.535554 | orchestrator | 2025-09-27 00:21:19 | INFO  | It takes a moment until task 3e53147d-a8f0-49d9-9b75-80789582ddcb (squid) has been started and output is visible here. 
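The squid step below brings up a Squid proxy for the testbed via docker compose and then waits on its container health check. A minimal post-deployment check, assuming the container is named squid and listens on Squid's default port 3128 on the manager address seen above (neither detail is confirmed by this log), could look like:

  # Hedged sketch: wait for the squid container to report healthy, then verify
  # that HTTP(S) requests are actually forwarded through the proxy.
  for i in $(seq 1 60); do
      state=$(docker inspect -f '{{.State.Health.Status}}' squid 2>/dev/null || echo unknown)
      [ "$state" = "healthy" ] && break
      sleep 5
  done
  # Print the HTTP status code returned when fetching through the proxy.
  curl -sS -x http://192.168.16.5:3128 -o /dev/null -w '%{http_code}\n' https://registry.osism.tech/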
2025-09-27 00:23:12.442403 | orchestrator | 2025-09-27 00:23:12.442554 | orchestrator | PLAY [Apply role squid] ******************************************************** 2025-09-27 00:23:12.442571 | orchestrator | 2025-09-27 00:23:12.442583 | orchestrator | TASK [osism.services.squid : Include install tasks] **************************** 2025-09-27 00:23:12.442595 | orchestrator | Saturday 27 September 2025 00:21:23 +0000 (0:00:00.148) 0:00:00.148 **** 2025-09-27 00:23:12.442606 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/squid/tasks/install-Debian-family.yml for testbed-manager 2025-09-27 00:23:12.442650 | orchestrator | 2025-09-27 00:23:12.442661 | orchestrator | TASK [osism.services.squid : Install required packages] ************************ 2025-09-27 00:23:12.442673 | orchestrator | Saturday 27 September 2025 00:21:23 +0000 (0:00:00.111) 0:00:00.259 **** 2025-09-27 00:23:12.442684 | orchestrator | ok: [testbed-manager] 2025-09-27 00:23:12.442697 | orchestrator | 2025-09-27 00:23:12.442708 | orchestrator | TASK [osism.services.squid : Create required directories] ********************** 2025-09-27 00:23:12.442719 | orchestrator | Saturday 27 September 2025 00:21:24 +0000 (0:00:01.177) 0:00:01.436 **** 2025-09-27 00:23:12.442731 | orchestrator | changed: [testbed-manager] => (item=/opt/squid/configuration) 2025-09-27 00:23:12.442742 | orchestrator | changed: [testbed-manager] => (item=/opt/squid/configuration/conf.d) 2025-09-27 00:23:12.442752 | orchestrator | ok: [testbed-manager] => (item=/opt/squid) 2025-09-27 00:23:12.442763 | orchestrator | 2025-09-27 00:23:12.442774 | orchestrator | TASK [osism.services.squid : Copy squid configuration files] ******************* 2025-09-27 00:23:12.442784 | orchestrator | Saturday 27 September 2025 00:21:25 +0000 (0:00:01.135) 0:00:02.572 **** 2025-09-27 00:23:12.442795 | orchestrator | changed: [testbed-manager] => (item=osism.conf) 2025-09-27 00:23:12.442806 | orchestrator | 2025-09-27 00:23:12.442817 | orchestrator | TASK [osism.services.squid : Remove osism_allow_list.conf configuration file] *** 2025-09-27 00:23:12.442828 | orchestrator | Saturday 27 September 2025 00:21:26 +0000 (0:00:01.076) 0:00:03.649 **** 2025-09-27 00:23:12.442838 | orchestrator | ok: [testbed-manager] 2025-09-27 00:23:12.442849 | orchestrator | 2025-09-27 00:23:12.442860 | orchestrator | TASK [osism.services.squid : Copy docker-compose.yml file] ********************* 2025-09-27 00:23:12.442870 | orchestrator | Saturday 27 September 2025 00:21:27 +0000 (0:00:00.371) 0:00:04.021 **** 2025-09-27 00:23:12.442881 | orchestrator | changed: [testbed-manager] 2025-09-27 00:23:12.442892 | orchestrator | 2025-09-27 00:23:12.442903 | orchestrator | TASK [osism.services.squid : Manage squid service] ***************************** 2025-09-27 00:23:12.442916 | orchestrator | Saturday 27 September 2025 00:21:28 +0000 (0:00:00.922) 0:00:04.943 **** 2025-09-27 00:23:12.442929 | orchestrator | FAILED - RETRYING: [testbed-manager]: Manage squid service (10 retries left). 
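The FAILED - RETRYING message above is Ansible's retries/until loop on the "Manage squid service" task: it keeps polling while the compose project pulls images and starts containers, and succeeds on a later attempt. A rough bash equivalent of that polling (the project directory and the grep target are illustrative assumptions, not taken from the role):

  attempt=1
  max_attempts=10
  # Poll until the squid compose project reports a healthy container.
  until docker compose --project-directory /opt/squid ps | grep -q "(healthy)"; do
      if [ "$attempt" -ge "$max_attempts" ]; then
          echo "squid service did not become healthy after $max_attempts attempts" >&2
          exit 1
      fi
      attempt=$((attempt + 1))
      sleep 10
  done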
2025-09-27 00:23:12.442943 | orchestrator | ok: [testbed-manager] 2025-09-27 00:23:12.442955 | orchestrator | 2025-09-27 00:23:12.442969 | orchestrator | RUNNING HANDLER [osism.services.squid : Restart squid service] ***************** 2025-09-27 00:23:12.442981 | orchestrator | Saturday 27 September 2025 00:21:59 +0000 (0:00:31.367) 0:00:36.311 **** 2025-09-27 00:23:12.442993 | orchestrator | changed: [testbed-manager] 2025-09-27 00:23:12.443005 | orchestrator | 2025-09-27 00:23:12.443018 | orchestrator | RUNNING HANDLER [osism.services.squid : Wait for squid service to start] ******* 2025-09-27 00:23:12.443030 | orchestrator | Saturday 27 September 2025 00:22:11 +0000 (0:00:12.054) 0:00:48.365 **** 2025-09-27 00:23:12.443042 | orchestrator | Pausing for 60 seconds 2025-09-27 00:23:12.443054 | orchestrator | changed: [testbed-manager] 2025-09-27 00:23:12.443067 | orchestrator | 2025-09-27 00:23:12.443080 | orchestrator | RUNNING HANDLER [osism.services.squid : Register that squid service was restarted] *** 2025-09-27 00:23:12.443092 | orchestrator | Saturday 27 September 2025 00:23:11 +0000 (0:01:00.077) 0:01:48.443 **** 2025-09-27 00:23:12.443105 | orchestrator | ok: [testbed-manager] 2025-09-27 00:23:12.443117 | orchestrator | 2025-09-27 00:23:12.443130 | orchestrator | RUNNING HANDLER [osism.services.squid : Wait for an healthy squid service] ***** 2025-09-27 00:23:12.443142 | orchestrator | Saturday 27 September 2025 00:23:11 +0000 (0:00:00.057) 0:01:48.500 **** 2025-09-27 00:23:12.443154 | orchestrator | changed: [testbed-manager] 2025-09-27 00:23:12.443166 | orchestrator | 2025-09-27 00:23:12.443179 | orchestrator | PLAY RECAP ********************************************************************* 2025-09-27 00:23:12.443217 | orchestrator | testbed-manager : ok=11  changed=6  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 2025-09-27 00:23:12.443230 | orchestrator | 2025-09-27 00:23:12.443242 | orchestrator | 2025-09-27 00:23:12.443255 | orchestrator | TASKS RECAP ******************************************************************** 2025-09-27 00:23:12.443275 | orchestrator | Saturday 27 September 2025 00:23:12 +0000 (0:00:00.609) 0:01:49.110 **** 2025-09-27 00:23:12.443286 | orchestrator | =============================================================================== 2025-09-27 00:23:12.443296 | orchestrator | osism.services.squid : Wait for squid service to start ----------------- 60.08s 2025-09-27 00:23:12.443307 | orchestrator | osism.services.squid : Manage squid service ---------------------------- 31.37s 2025-09-27 00:23:12.443318 | orchestrator | osism.services.squid : Restart squid service --------------------------- 12.05s 2025-09-27 00:23:12.443329 | orchestrator | osism.services.squid : Install required packages ------------------------ 1.18s 2025-09-27 00:23:12.443340 | orchestrator | osism.services.squid : Create required directories ---------------------- 1.14s 2025-09-27 00:23:12.443350 | orchestrator | osism.services.squid : Copy squid configuration files ------------------- 1.08s 2025-09-27 00:23:12.443361 | orchestrator | osism.services.squid : Copy docker-compose.yml file --------------------- 0.92s 2025-09-27 00:23:12.443371 | orchestrator | osism.services.squid : Wait for an healthy squid service ---------------- 0.61s 2025-09-27 00:23:12.443382 | orchestrator | osism.services.squid : Remove osism_allow_list.conf configuration file --- 0.37s 2025-09-27 00:23:12.443393 | orchestrator | osism.services.squid : Include install tasks ---------------------------- 
0.11s 2025-09-27 00:23:12.443403 | orchestrator | osism.services.squid : Register that squid service was restarted -------- 0.06s 2025-09-27 00:23:12.697510 | orchestrator | + [[ latest != \l\a\t\e\s\t ]] 2025-09-27 00:23:12.697790 | orchestrator | ++ semver latest 9.0.0 2025-09-27 00:23:12.747304 | orchestrator | + [[ -1 -lt 0 ]] 2025-09-27 00:23:12.747330 | orchestrator | + [[ latest != \l\a\t\e\s\t ]] 2025-09-27 00:23:12.748079 | orchestrator | + osism apply operator -u ubuntu -l testbed-nodes 2025-09-27 00:23:24.645048 | orchestrator | 2025-09-27 00:23:24 | INFO  | Task 4023eea9-033f-4853-9f0c-82294e25101e (operator) was prepared for execution. 2025-09-27 00:23:24.645170 | orchestrator | 2025-09-27 00:23:24 | INFO  | It takes a moment until task 4023eea9-033f-4853-9f0c-82294e25101e (operator) has been started and output is visible here. 2025-09-27 00:23:39.777338 | orchestrator | 2025-09-27 00:23:39.777455 | orchestrator | PLAY [Make ssh pipelining working] ********************************************* 2025-09-27 00:23:39.777472 | orchestrator | 2025-09-27 00:23:39.777484 | orchestrator | TASK [Gathering Facts] ********************************************************* 2025-09-27 00:23:39.777496 | orchestrator | Saturday 27 September 2025 00:23:28 +0000 (0:00:00.148) 0:00:00.148 **** 2025-09-27 00:23:39.777507 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:23:39.777519 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:23:39.777531 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:23:39.777541 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:23:39.777552 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:23:39.777563 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:23:39.777574 | orchestrator | 2025-09-27 00:23:39.777585 | orchestrator | TASK [Do not require tty for all users] **************************************** 2025-09-27 00:23:39.777596 | orchestrator | Saturday 27 September 2025 00:23:31 +0000 (0:00:03.367) 0:00:03.516 **** 2025-09-27 00:23:39.777607 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:23:39.777618 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:23:39.777630 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:23:39.777641 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:23:39.777652 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:23:39.777663 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:23:39.777674 | orchestrator | 2025-09-27 00:23:39.777688 | orchestrator | PLAY [Apply role operator] ***************************************************** 2025-09-27 00:23:39.777700 | orchestrator | 2025-09-27 00:23:39.777711 | orchestrator | TASK [osism.commons.operator : Gather variables for each operating system] ***** 2025-09-27 00:23:39.777722 | orchestrator | Saturday 27 September 2025 00:23:32 +0000 (0:00:00.839) 0:00:04.355 **** 2025-09-27 00:23:39.777733 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:23:39.777744 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:23:39.777755 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:23:39.777792 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:23:39.777804 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:23:39.777815 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:23:39.777825 | orchestrator | 2025-09-27 00:23:39.777836 | orchestrator | TASK [osism.commons.operator : Set operator_groups variable to default value] *** 2025-09-27 00:23:39.777847 | orchestrator | Saturday 27 September 2025 00:23:32 +0000 (0:00:00.149) 0:00:04.504 **** 2025-09-27 00:23:39.777858 | orchestrator | ok: 
[testbed-node-0] 2025-09-27 00:23:39.777869 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:23:39.777879 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:23:39.777890 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:23:39.777901 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:23:39.777911 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:23:39.777922 | orchestrator | 2025-09-27 00:23:39.777933 | orchestrator | TASK [osism.commons.operator : Create operator group] ************************** 2025-09-27 00:23:39.777944 | orchestrator | Saturday 27 September 2025 00:23:32 +0000 (0:00:00.146) 0:00:04.651 **** 2025-09-27 00:23:39.777955 | orchestrator | changed: [testbed-node-0] 2025-09-27 00:23:39.777967 | orchestrator | changed: [testbed-node-5] 2025-09-27 00:23:39.777978 | orchestrator | changed: [testbed-node-2] 2025-09-27 00:23:39.778007 | orchestrator | changed: [testbed-node-4] 2025-09-27 00:23:39.778075 | orchestrator | changed: [testbed-node-1] 2025-09-27 00:23:39.778089 | orchestrator | changed: [testbed-node-3] 2025-09-27 00:23:39.778100 | orchestrator | 2025-09-27 00:23:39.778111 | orchestrator | TASK [osism.commons.operator : Create user] ************************************ 2025-09-27 00:23:39.778122 | orchestrator | Saturday 27 September 2025 00:23:33 +0000 (0:00:00.641) 0:00:05.293 **** 2025-09-27 00:23:39.778133 | orchestrator | changed: [testbed-node-0] 2025-09-27 00:23:39.778144 | orchestrator | changed: [testbed-node-4] 2025-09-27 00:23:39.778155 | orchestrator | changed: [testbed-node-2] 2025-09-27 00:23:39.778165 | orchestrator | changed: [testbed-node-5] 2025-09-27 00:23:39.778176 | orchestrator | changed: [testbed-node-3] 2025-09-27 00:23:39.778208 | orchestrator | changed: [testbed-node-1] 2025-09-27 00:23:39.778219 | orchestrator | 2025-09-27 00:23:39.778230 | orchestrator | TASK [osism.commons.operator : Add user to additional groups] ****************** 2025-09-27 00:23:39.778241 | orchestrator | Saturday 27 September 2025 00:23:34 +0000 (0:00:00.800) 0:00:06.094 **** 2025-09-27 00:23:39.778252 | orchestrator | changed: [testbed-node-1] => (item=adm) 2025-09-27 00:23:39.778269 | orchestrator | changed: [testbed-node-2] => (item=adm) 2025-09-27 00:23:39.778280 | orchestrator | changed: [testbed-node-0] => (item=adm) 2025-09-27 00:23:39.778291 | orchestrator | changed: [testbed-node-3] => (item=adm) 2025-09-27 00:23:39.778301 | orchestrator | changed: [testbed-node-5] => (item=adm) 2025-09-27 00:23:39.778312 | orchestrator | changed: [testbed-node-4] => (item=adm) 2025-09-27 00:23:39.778323 | orchestrator | changed: [testbed-node-1] => (item=sudo) 2025-09-27 00:23:39.778334 | orchestrator | changed: [testbed-node-2] => (item=sudo) 2025-09-27 00:23:39.778345 | orchestrator | changed: [testbed-node-0] => (item=sudo) 2025-09-27 00:23:39.778355 | orchestrator | changed: [testbed-node-5] => (item=sudo) 2025-09-27 00:23:39.778366 | orchestrator | changed: [testbed-node-3] => (item=sudo) 2025-09-27 00:23:39.778377 | orchestrator | changed: [testbed-node-4] => (item=sudo) 2025-09-27 00:23:39.778388 | orchestrator | 2025-09-27 00:23:39.778399 | orchestrator | TASK [osism.commons.operator : Copy user sudoers file] ************************* 2025-09-27 00:23:39.778409 | orchestrator | Saturday 27 September 2025 00:23:35 +0000 (0:00:01.088) 0:00:07.183 **** 2025-09-27 00:23:39.778420 | orchestrator | changed: [testbed-node-4] 2025-09-27 00:23:39.778431 | orchestrator | changed: [testbed-node-0] 2025-09-27 00:23:39.778441 | orchestrator | changed: [testbed-node-2] 
2025-09-27 00:23:39.778452 | orchestrator | changed: [testbed-node-3] 2025-09-27 00:23:39.778463 | orchestrator | changed: [testbed-node-5] 2025-09-27 00:23:39.778473 | orchestrator | changed: [testbed-node-1] 2025-09-27 00:23:39.778484 | orchestrator | 2025-09-27 00:23:39.778495 | orchestrator | TASK [osism.commons.operator : Set language variables in .bashrc configuration file] *** 2025-09-27 00:23:39.778515 | orchestrator | Saturday 27 September 2025 00:23:36 +0000 (0:00:01.119) 0:00:08.302 **** 2025-09-27 00:23:39.778526 | orchestrator | [WARNING]: Module remote_tmp /root/.ansible/tmp did not exist and was created 2025-09-27 00:23:39.778537 | orchestrator | with a mode of 0700, this may cause issues when running as another user. To 2025-09-27 00:23:39.778548 | orchestrator | avoid this, create the remote_tmp dir with the correct permissions manually 2025-09-27 00:23:39.778559 | orchestrator | changed: [testbed-node-1] => (item=export LANGUAGE=C.UTF-8) 2025-09-27 00:23:39.778588 | orchestrator | changed: [testbed-node-5] => (item=export LANGUAGE=C.UTF-8) 2025-09-27 00:23:39.778599 | orchestrator | changed: [testbed-node-0] => (item=export LANGUAGE=C.UTF-8) 2025-09-27 00:23:39.778610 | orchestrator | changed: [testbed-node-2] => (item=export LANGUAGE=C.UTF-8) 2025-09-27 00:23:39.778621 | orchestrator | changed: [testbed-node-3] => (item=export LANGUAGE=C.UTF-8) 2025-09-27 00:23:39.778632 | orchestrator | changed: [testbed-node-4] => (item=export LANGUAGE=C.UTF-8) 2025-09-27 00:23:39.778642 | orchestrator | changed: [testbed-node-1] => (item=export LANG=C.UTF-8) 2025-09-27 00:23:39.778653 | orchestrator | changed: [testbed-node-5] => (item=export LANG=C.UTF-8) 2025-09-27 00:23:39.778664 | orchestrator | changed: [testbed-node-0] => (item=export LANG=C.UTF-8) 2025-09-27 00:23:39.778675 | orchestrator | changed: [testbed-node-2] => (item=export LANG=C.UTF-8) 2025-09-27 00:23:39.778686 | orchestrator | changed: [testbed-node-4] => (item=export LANG=C.UTF-8) 2025-09-27 00:23:39.778696 | orchestrator | changed: [testbed-node-3] => (item=export LANG=C.UTF-8) 2025-09-27 00:23:39.778707 | orchestrator | changed: [testbed-node-0] => (item=export LC_ALL=C.UTF-8) 2025-09-27 00:23:39.778718 | orchestrator | changed: [testbed-node-1] => (item=export LC_ALL=C.UTF-8) 2025-09-27 00:23:39.778729 | orchestrator | changed: [testbed-node-5] => (item=export LC_ALL=C.UTF-8) 2025-09-27 00:23:39.778740 | orchestrator | changed: [testbed-node-2] => (item=export LC_ALL=C.UTF-8) 2025-09-27 00:23:39.778751 | orchestrator | changed: [testbed-node-4] => (item=export LC_ALL=C.UTF-8) 2025-09-27 00:23:39.778762 | orchestrator | changed: [testbed-node-3] => (item=export LC_ALL=C.UTF-8) 2025-09-27 00:23:39.778773 | orchestrator | 2025-09-27 00:23:39.778784 | orchestrator | TASK [osism.commons.operator : Set custom environment variables in .bashrc configuration file] *** 2025-09-27 00:23:39.778795 | orchestrator | Saturday 27 September 2025 00:23:37 +0000 (0:00:01.215) 0:00:09.519 **** 2025-09-27 00:23:39.778806 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:23:39.778817 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:23:39.778828 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:23:39.778839 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:23:39.778850 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:23:39.778860 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:23:39.778871 | orchestrator | 2025-09-27 00:23:39.778882 | orchestrator | TASK [osism.commons.operator : 
Create .ssh directory] ************************** 2025-09-27 00:23:39.778893 | orchestrator | Saturday 27 September 2025 00:23:37 +0000 (0:00:00.151) 0:00:09.670 **** 2025-09-27 00:23:39.778904 | orchestrator | changed: [testbed-node-4] 2025-09-27 00:23:39.778914 | orchestrator | changed: [testbed-node-0] 2025-09-27 00:23:39.778925 | orchestrator | changed: [testbed-node-5] 2025-09-27 00:23:39.778936 | orchestrator | changed: [testbed-node-2] 2025-09-27 00:23:39.778947 | orchestrator | changed: [testbed-node-3] 2025-09-27 00:23:39.778958 | orchestrator | changed: [testbed-node-1] 2025-09-27 00:23:39.778969 | orchestrator | 2025-09-27 00:23:39.778980 | orchestrator | TASK [osism.commons.operator : Check number of SSH authorized keys] ************ 2025-09-27 00:23:39.778991 | orchestrator | Saturday 27 September 2025 00:23:38 +0000 (0:00:00.543) 0:00:10.213 **** 2025-09-27 00:23:39.779002 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:23:39.779012 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:23:39.779023 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:23:39.779034 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:23:39.779051 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:23:39.779062 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:23:39.779073 | orchestrator | 2025-09-27 00:23:39.779084 | orchestrator | TASK [osism.commons.operator : Set ssh authorized keys] ************************ 2025-09-27 00:23:39.779095 | orchestrator | Saturday 27 September 2025 00:23:38 +0000 (0:00:00.154) 0:00:10.367 **** 2025-09-27 00:23:39.779106 | orchestrator | changed: [testbed-node-5] => (item=None) 2025-09-27 00:23:39.779117 | orchestrator | changed: [testbed-node-0] => (item=None) 2025-09-27 00:23:39.779127 | orchestrator | changed: [testbed-node-5] 2025-09-27 00:23:39.779138 | orchestrator | changed: [testbed-node-2] => (item=None) 2025-09-27 00:23:39.779149 | orchestrator | changed: [testbed-node-0] 2025-09-27 00:23:39.779160 | orchestrator | changed: [testbed-node-2] 2025-09-27 00:23:39.779171 | orchestrator | changed: [testbed-node-4] => (item=None) 2025-09-27 00:23:39.779208 | orchestrator | changed: [testbed-node-4] 2025-09-27 00:23:39.779220 | orchestrator | changed: [testbed-node-1] => (item=None) 2025-09-27 00:23:39.779230 | orchestrator | changed: [testbed-node-1] 2025-09-27 00:23:39.779241 | orchestrator | changed: [testbed-node-3] => (item=None) 2025-09-27 00:23:39.779252 | orchestrator | changed: [testbed-node-3] 2025-09-27 00:23:39.779262 | orchestrator | 2025-09-27 00:23:39.779273 | orchestrator | TASK [osism.commons.operator : Delete ssh authorized keys] ********************* 2025-09-27 00:23:39.779284 | orchestrator | Saturday 27 September 2025 00:23:39 +0000 (0:00:00.661) 0:00:11.029 **** 2025-09-27 00:23:39.779295 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:23:39.779305 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:23:39.779316 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:23:39.779327 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:23:39.779337 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:23:39.779348 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:23:39.779359 | orchestrator | 2025-09-27 00:23:39.779369 | orchestrator | TASK [osism.commons.operator : Set authorized GitHub accounts] ***************** 2025-09-27 00:23:39.779380 | orchestrator | Saturday 27 September 2025 00:23:39 +0000 (0:00:00.140) 0:00:11.170 **** 2025-09-27 00:23:39.779391 | orchestrator 
| skipping: [testbed-node-0] 2025-09-27 00:23:39.779402 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:23:39.779413 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:23:39.779423 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:23:39.779434 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:23:39.779444 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:23:39.779455 | orchestrator | 2025-09-27 00:23:39.779466 | orchestrator | TASK [osism.commons.operator : Delete authorized GitHub accounts] ************** 2025-09-27 00:23:39.779477 | orchestrator | Saturday 27 September 2025 00:23:39 +0000 (0:00:00.142) 0:00:11.313 **** 2025-09-27 00:23:39.779488 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:23:39.779498 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:23:39.779509 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:23:39.779520 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:23:39.779538 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:23:40.790737 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:23:40.790834 | orchestrator | 2025-09-27 00:23:40.790848 | orchestrator | TASK [osism.commons.operator : Set password] *********************************** 2025-09-27 00:23:40.790862 | orchestrator | Saturday 27 September 2025 00:23:39 +0000 (0:00:00.146) 0:00:11.459 **** 2025-09-27 00:23:40.790873 | orchestrator | changed: [testbed-node-0] 2025-09-27 00:23:40.790884 | orchestrator | changed: [testbed-node-1] 2025-09-27 00:23:40.790895 | orchestrator | changed: [testbed-node-2] 2025-09-27 00:23:40.790906 | orchestrator | changed: [testbed-node-3] 2025-09-27 00:23:40.790917 | orchestrator | changed: [testbed-node-4] 2025-09-27 00:23:40.790927 | orchestrator | changed: [testbed-node-5] 2025-09-27 00:23:40.790939 | orchestrator | 2025-09-27 00:23:40.790950 | orchestrator | TASK [osism.commons.operator : Unset & lock password] ************************** 2025-09-27 00:23:40.790962 | orchestrator | Saturday 27 September 2025 00:23:40 +0000 (0:00:00.591) 0:00:12.051 **** 2025-09-27 00:23:40.790997 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:23:40.791008 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:23:40.791019 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:23:40.791030 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:23:40.791041 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:23:40.791052 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:23:40.791063 | orchestrator | 2025-09-27 00:23:40.791074 | orchestrator | PLAY RECAP ********************************************************************* 2025-09-27 00:23:40.791086 | orchestrator | testbed-node-0 : ok=12  changed=8  unreachable=0 failed=0 skipped=6  rescued=0 ignored=0 2025-09-27 00:23:40.791099 | orchestrator | testbed-node-1 : ok=12  changed=8  unreachable=0 failed=0 skipped=6  rescued=0 ignored=0 2025-09-27 00:23:40.791110 | orchestrator | testbed-node-2 : ok=12  changed=8  unreachable=0 failed=0 skipped=6  rescued=0 ignored=0 2025-09-27 00:23:40.791121 | orchestrator | testbed-node-3 : ok=12  changed=8  unreachable=0 failed=0 skipped=6  rescued=0 ignored=0 2025-09-27 00:23:40.791131 | orchestrator | testbed-node-4 : ok=12  changed=8  unreachable=0 failed=0 skipped=6  rescued=0 ignored=0 2025-09-27 00:23:40.791157 | orchestrator | testbed-node-5 : ok=12  changed=8  unreachable=0 failed=0 skipped=6  rescued=0 ignored=0 2025-09-27 00:23:40.791169 | orchestrator | 2025-09-27 00:23:40.791180 | 
orchestrator | 2025-09-27 00:23:40.791240 | orchestrator | TASKS RECAP ******************************************************************** 2025-09-27 00:23:40.791251 | orchestrator | Saturday 27 September 2025 00:23:40 +0000 (0:00:00.205) 0:00:12.256 **** 2025-09-27 00:23:40.791262 | orchestrator | =============================================================================== 2025-09-27 00:23:40.791273 | orchestrator | Gathering Facts --------------------------------------------------------- 3.37s 2025-09-27 00:23:40.791285 | orchestrator | osism.commons.operator : Set language variables in .bashrc configuration file --- 1.22s 2025-09-27 00:23:40.791298 | orchestrator | osism.commons.operator : Copy user sudoers file ------------------------- 1.12s 2025-09-27 00:23:40.791310 | orchestrator | osism.commons.operator : Add user to additional groups ------------------ 1.09s 2025-09-27 00:23:40.791323 | orchestrator | Do not require tty for all users ---------------------------------------- 0.84s 2025-09-27 00:23:40.791340 | orchestrator | osism.commons.operator : Create user ------------------------------------ 0.80s 2025-09-27 00:23:40.791354 | orchestrator | osism.commons.operator : Set ssh authorized keys ------------------------ 0.66s 2025-09-27 00:23:40.791367 | orchestrator | osism.commons.operator : Create operator group -------------------------- 0.64s 2025-09-27 00:23:40.791379 | orchestrator | osism.commons.operator : Set password ----------------------------------- 0.59s 2025-09-27 00:23:40.791392 | orchestrator | osism.commons.operator : Create .ssh directory -------------------------- 0.54s 2025-09-27 00:23:40.791404 | orchestrator | osism.commons.operator : Unset & lock password -------------------------- 0.21s 2025-09-27 00:23:40.791416 | orchestrator | osism.commons.operator : Check number of SSH authorized keys ------------ 0.15s 2025-09-27 00:23:40.791428 | orchestrator | osism.commons.operator : Set custom environment variables in .bashrc configuration file --- 0.15s 2025-09-27 00:23:40.791441 | orchestrator | osism.commons.operator : Gather variables for each operating system ----- 0.15s 2025-09-27 00:23:40.791454 | orchestrator | osism.commons.operator : Set operator_groups variable to default value --- 0.15s 2025-09-27 00:23:40.791466 | orchestrator | osism.commons.operator : Delete authorized GitHub accounts -------------- 0.15s 2025-09-27 00:23:40.791476 | orchestrator | osism.commons.operator : Set authorized GitHub accounts ----------------- 0.14s 2025-09-27 00:23:40.791487 | orchestrator | osism.commons.operator : Delete ssh authorized keys --------------------- 0.14s 2025-09-27 00:23:41.067060 | orchestrator | + osism apply --environment custom facts 2025-09-27 00:23:42.882597 | orchestrator | 2025-09-27 00:23:42 | INFO  | Trying to run play facts in environment custom 2025-09-27 00:23:53.003399 | orchestrator | 2025-09-27 00:23:52 | INFO  | Task 9a1a1c59-9595-4994-9a78-92329c947841 (facts) was prepared for execution. 2025-09-27 00:23:53.003507 | orchestrator | 2025-09-27 00:23:52 | INFO  | It takes a moment until task 9a1a1c59-9595-4994-9a78-92329c947841 (facts) has been started and output is visible here. 
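The facts play that starts here drops files such as testbed_ceph_devices into each host's local facts directory so later plays can read them as ansible_local values. A minimal sketch of how such facts can be inspected once the play has run; /etc/ansible/facts.d is Ansible's standard local-facts location, while the exact file names and a .fact suffix are assumptions based on the item names shown below:

  # On a node: list the custom fact files (standard Ansible local-facts path).
  ls /etc/ansible/facts.d/
  # A file like testbed_ceph_devices.fact (JSON or INI, name assumed) is then
  # exposed under ansible_local when facts are gathered:
  ansible testbed-node-3 -m setup -a 'filter=ansible_local'

Playbooks would then reference it as, for example, ansible_local.testbed_ceph_devices, which is presumably how the ceph device lists copied in this play are consumed later in the deployment.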
2025-09-27 00:24:37.353774 | orchestrator | 2025-09-27 00:24:37.353890 | orchestrator | PLAY [Copy custom network devices fact] **************************************** 2025-09-27 00:24:37.353907 | orchestrator | 2025-09-27 00:24:37.353918 | orchestrator | TASK [Create custom facts directory] ******************************************* 2025-09-27 00:24:37.353930 | orchestrator | Saturday 27 September 2025 00:23:56 +0000 (0:00:00.083) 0:00:00.083 **** 2025-09-27 00:24:37.353941 | orchestrator | ok: [testbed-manager] 2025-09-27 00:24:37.353953 | orchestrator | changed: [testbed-node-2] 2025-09-27 00:24:37.353964 | orchestrator | changed: [testbed-node-1] 2025-09-27 00:24:37.353976 | orchestrator | changed: [testbed-node-5] 2025-09-27 00:24:37.353986 | orchestrator | changed: [testbed-node-3] 2025-09-27 00:24:37.353997 | orchestrator | changed: [testbed-node-0] 2025-09-27 00:24:37.354007 | orchestrator | changed: [testbed-node-4] 2025-09-27 00:24:37.354074 | orchestrator | 2025-09-27 00:24:37.354089 | orchestrator | TASK [Copy fact file] ********************************************************** 2025-09-27 00:24:37.354100 | orchestrator | Saturday 27 September 2025 00:23:58 +0000 (0:00:01.350) 0:00:01.433 **** 2025-09-27 00:24:37.354111 | orchestrator | ok: [testbed-manager] 2025-09-27 00:24:37.354122 | orchestrator | changed: [testbed-node-1] 2025-09-27 00:24:37.354133 | orchestrator | changed: [testbed-node-0] 2025-09-27 00:24:37.354144 | orchestrator | changed: [testbed-node-3] 2025-09-27 00:24:37.354155 | orchestrator | changed: [testbed-node-5] 2025-09-27 00:24:37.354166 | orchestrator | changed: [testbed-node-2] 2025-09-27 00:24:37.354177 | orchestrator | changed: [testbed-node-4] 2025-09-27 00:24:37.354230 | orchestrator | 2025-09-27 00:24:37.354242 | orchestrator | PLAY [Copy custom ceph devices facts] ****************************************** 2025-09-27 00:24:37.354253 | orchestrator | 2025-09-27 00:24:37.354264 | orchestrator | TASK [osism.commons.repository : Gather variables for each operating system] *** 2025-09-27 00:24:37.354275 | orchestrator | Saturday 27 September 2025 00:23:59 +0000 (0:00:01.112) 0:00:02.546 **** 2025-09-27 00:24:37.354288 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:24:37.354300 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:24:37.354313 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:24:37.354325 | orchestrator | 2025-09-27 00:24:37.354338 | orchestrator | TASK [osism.commons.repository : Set repository_default fact to default value] *** 2025-09-27 00:24:37.354352 | orchestrator | Saturday 27 September 2025 00:23:59 +0000 (0:00:00.127) 0:00:02.674 **** 2025-09-27 00:24:37.354364 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:24:37.354376 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:24:37.354389 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:24:37.354401 | orchestrator | 2025-09-27 00:24:37.354413 | orchestrator | TASK [osism.commons.repository : Set repositories to default] ****************** 2025-09-27 00:24:37.354426 | orchestrator | Saturday 27 September 2025 00:23:59 +0000 (0:00:00.200) 0:00:02.874 **** 2025-09-27 00:24:37.354439 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:24:37.354452 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:24:37.354464 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:24:37.354476 | orchestrator | 2025-09-27 00:24:37.354489 | orchestrator | TASK [osism.commons.repository : Include distribution specific repository tasks] *** 2025-09-27 00:24:37.354502 | orchestrator | Saturday 
27 September 2025 00:23:59 +0000 (0:00:00.194) 0:00:03.068 **** 2025-09-27 00:24:37.354515 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/repository/tasks/Ubuntu.yml for testbed-node-3, testbed-node-4, testbed-node-5 2025-09-27 00:24:37.354529 | orchestrator | 2025-09-27 00:24:37.354541 | orchestrator | TASK [osism.commons.repository : Create /etc/apt/sources.list.d directory] ***** 2025-09-27 00:24:37.354582 | orchestrator | Saturday 27 September 2025 00:23:59 +0000 (0:00:00.132) 0:00:03.201 **** 2025-09-27 00:24:37.354595 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:24:37.354608 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:24:37.354620 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:24:37.354632 | orchestrator | 2025-09-27 00:24:37.354645 | orchestrator | TASK [osism.commons.repository : Include tasks for Ubuntu < 24.04] ************* 2025-09-27 00:24:37.354657 | orchestrator | Saturday 27 September 2025 00:24:00 +0000 (0:00:00.388) 0:00:03.590 **** 2025-09-27 00:24:37.354667 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:24:37.354692 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:24:37.354704 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:24:37.354714 | orchestrator | 2025-09-27 00:24:37.354725 | orchestrator | TASK [osism.commons.repository : Copy 99osism apt configuration] *************** 2025-09-27 00:24:37.354736 | orchestrator | Saturday 27 September 2025 00:24:00 +0000 (0:00:00.129) 0:00:03.719 **** 2025-09-27 00:24:37.354747 | orchestrator | changed: [testbed-node-3] 2025-09-27 00:24:37.354757 | orchestrator | changed: [testbed-node-5] 2025-09-27 00:24:37.354768 | orchestrator | changed: [testbed-node-4] 2025-09-27 00:24:37.354779 | orchestrator | 2025-09-27 00:24:37.354790 | orchestrator | TASK [osism.commons.repository : Remove sources.list file] ********************* 2025-09-27 00:24:37.354800 | orchestrator | Saturday 27 September 2025 00:24:01 +0000 (0:00:00.993) 0:00:04.713 **** 2025-09-27 00:24:37.354811 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:24:37.354822 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:24:37.354832 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:24:37.354843 | orchestrator | 2025-09-27 00:24:37.354854 | orchestrator | TASK [osism.commons.repository : Copy ubuntu.sources file] ********************* 2025-09-27 00:24:37.354864 | orchestrator | Saturday 27 September 2025 00:24:01 +0000 (0:00:00.415) 0:00:05.128 **** 2025-09-27 00:24:37.354875 | orchestrator | changed: [testbed-node-3] 2025-09-27 00:24:37.354886 | orchestrator | changed: [testbed-node-5] 2025-09-27 00:24:37.354897 | orchestrator | changed: [testbed-node-4] 2025-09-27 00:24:37.354907 | orchestrator | 2025-09-27 00:24:37.354918 | orchestrator | TASK [osism.commons.repository : Update package cache] ************************* 2025-09-27 00:24:37.354929 | orchestrator | Saturday 27 September 2025 00:24:02 +0000 (0:00:00.926) 0:00:06.055 **** 2025-09-27 00:24:37.354939 | orchestrator | changed: [testbed-node-5] 2025-09-27 00:24:37.354950 | orchestrator | changed: [testbed-node-4] 2025-09-27 00:24:37.354960 | orchestrator | changed: [testbed-node-3] 2025-09-27 00:24:37.354971 | orchestrator | 2025-09-27 00:24:37.354982 | orchestrator | TASK [Install required packages (RedHat)] ************************************** 2025-09-27 00:24:37.354993 | orchestrator | Saturday 27 September 2025 00:24:20 +0000 (0:00:17.895) 0:00:23.950 **** 2025-09-27 00:24:37.355003 | orchestrator | 
skipping: [testbed-node-3] 2025-09-27 00:24:37.355014 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:24:37.355025 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:24:37.355035 | orchestrator | 2025-09-27 00:24:37.355046 | orchestrator | TASK [Install required packages (Debian)] ************************************** 2025-09-27 00:24:37.355076 | orchestrator | Saturday 27 September 2025 00:24:20 +0000 (0:00:00.094) 0:00:24.044 **** 2025-09-27 00:24:37.355088 | orchestrator | changed: [testbed-node-3] 2025-09-27 00:24:37.355099 | orchestrator | changed: [testbed-node-5] 2025-09-27 00:24:37.355110 | orchestrator | changed: [testbed-node-4] 2025-09-27 00:24:37.355120 | orchestrator | 2025-09-27 00:24:37.355131 | orchestrator | TASK [Create custom facts directory] ******************************************* 2025-09-27 00:24:37.355142 | orchestrator | Saturday 27 September 2025 00:24:28 +0000 (0:00:07.783) 0:00:31.828 **** 2025-09-27 00:24:37.355153 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:24:37.355163 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:24:37.355174 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:24:37.355202 | orchestrator | 2025-09-27 00:24:37.355214 | orchestrator | TASK [Copy fact files] ********************************************************* 2025-09-27 00:24:37.355225 | orchestrator | Saturday 27 September 2025 00:24:28 +0000 (0:00:00.407) 0:00:32.236 **** 2025-09-27 00:24:37.355244 | orchestrator | changed: [testbed-node-3] => (item=testbed_ceph_devices) 2025-09-27 00:24:37.355255 | orchestrator | changed: [testbed-node-5] => (item=testbed_ceph_devices) 2025-09-27 00:24:37.355266 | orchestrator | changed: [testbed-node-4] => (item=testbed_ceph_devices) 2025-09-27 00:24:37.355277 | orchestrator | changed: [testbed-node-3] => (item=testbed_ceph_devices_all) 2025-09-27 00:24:37.355287 | orchestrator | changed: [testbed-node-5] => (item=testbed_ceph_devices_all) 2025-09-27 00:24:37.355298 | orchestrator | changed: [testbed-node-4] => (item=testbed_ceph_devices_all) 2025-09-27 00:24:37.355308 | orchestrator | changed: [testbed-node-5] => (item=testbed_ceph_osd_devices) 2025-09-27 00:24:37.355319 | orchestrator | changed: [testbed-node-3] => (item=testbed_ceph_osd_devices) 2025-09-27 00:24:37.355329 | orchestrator | changed: [testbed-node-4] => (item=testbed_ceph_osd_devices) 2025-09-27 00:24:37.355340 | orchestrator | changed: [testbed-node-5] => (item=testbed_ceph_osd_devices_all) 2025-09-27 00:24:37.355351 | orchestrator | changed: [testbed-node-3] => (item=testbed_ceph_osd_devices_all) 2025-09-27 00:24:37.355361 | orchestrator | changed: [testbed-node-4] => (item=testbed_ceph_osd_devices_all) 2025-09-27 00:24:37.355372 | orchestrator | 2025-09-27 00:24:37.355383 | orchestrator | RUNNING HANDLER [osism.commons.repository : Force update of package cache] ***** 2025-09-27 00:24:37.355393 | orchestrator | Saturday 27 September 2025 00:24:32 +0000 (0:00:03.403) 0:00:35.639 **** 2025-09-27 00:24:37.355404 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:24:37.355414 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:24:37.355425 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:24:37.355436 | orchestrator | 2025-09-27 00:24:37.355446 | orchestrator | PLAY [Gather facts for all hosts] ********************************************** 2025-09-27 00:24:37.355457 | orchestrator | 2025-09-27 00:24:37.355468 | orchestrator | TASK [Gathers facts about hosts] *********************************************** 2025-09-27 00:24:37.355478 | orchestrator | 
Saturday 27 September 2025 00:24:33 +0000 (0:00:01.308) 0:00:36.948 **** 2025-09-27 00:24:37.355489 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:24:37.355500 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:24:37.355510 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:24:37.355521 | orchestrator | ok: [testbed-manager] 2025-09-27 00:24:37.355531 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:24:37.355542 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:24:37.355552 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:24:37.355563 | orchestrator | 2025-09-27 00:24:37.355574 | orchestrator | PLAY RECAP ********************************************************************* 2025-09-27 00:24:37.355585 | orchestrator | testbed-manager : ok=3  changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 2025-09-27 00:24:37.355596 | orchestrator | testbed-node-0 : ok=3  changed=2  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 2025-09-27 00:24:37.355609 | orchestrator | testbed-node-1 : ok=3  changed=2  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 2025-09-27 00:24:37.355620 | orchestrator | testbed-node-2 : ok=3  changed=2  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 2025-09-27 00:24:37.355667 | orchestrator | testbed-node-3 : ok=16  changed=7  unreachable=0 failed=0 skipped=2  rescued=0 ignored=0 2025-09-27 00:24:37.355680 | orchestrator | testbed-node-4 : ok=16  changed=7  unreachable=0 failed=0 skipped=2  rescued=0 ignored=0 2025-09-27 00:24:37.355691 | orchestrator | testbed-node-5 : ok=16  changed=7  unreachable=0 failed=0 skipped=2  rescued=0 ignored=0 2025-09-27 00:24:37.355701 | orchestrator | 2025-09-27 00:24:37.355712 | orchestrator | 2025-09-27 00:24:37.355723 | orchestrator | TASKS RECAP ******************************************************************** 2025-09-27 00:24:37.355740 | orchestrator | Saturday 27 September 2025 00:24:37 +0000 (0:00:03.704) 0:00:40.652 **** 2025-09-27 00:24:37.355751 | orchestrator | =============================================================================== 2025-09-27 00:24:37.355762 | orchestrator | osism.commons.repository : Update package cache ------------------------ 17.90s 2025-09-27 00:24:37.355772 | orchestrator | Install required packages (Debian) -------------------------------------- 7.78s 2025-09-27 00:24:37.355783 | orchestrator | Gathers facts about hosts ----------------------------------------------- 3.70s 2025-09-27 00:24:37.355794 | orchestrator | Copy fact files --------------------------------------------------------- 3.40s 2025-09-27 00:24:37.355804 | orchestrator | Create custom facts directory ------------------------------------------- 1.35s 2025-09-27 00:24:37.355815 | orchestrator | osism.commons.repository : Force update of package cache ---------------- 1.31s 2025-09-27 00:24:37.355832 | orchestrator | Copy fact file ---------------------------------------------------------- 1.11s 2025-09-27 00:24:37.571540 | orchestrator | osism.commons.repository : Copy 99osism apt configuration --------------- 0.99s 2025-09-27 00:24:37.571604 | orchestrator | osism.commons.repository : Copy ubuntu.sources file --------------------- 0.93s 2025-09-27 00:24:37.571616 | orchestrator | osism.commons.repository : Remove sources.list file --------------------- 0.42s 2025-09-27 00:24:37.571627 | orchestrator | Create custom facts directory ------------------------------------------- 0.41s 2025-09-27 00:24:37.571638 | orchestrator | osism.commons.repository : Create /etc/apt/sources.list.d directory 
----- 0.39s 2025-09-27 00:24:37.571649 | orchestrator | osism.commons.repository : Set repository_default fact to default value --- 0.20s 2025-09-27 00:24:37.571660 | orchestrator | osism.commons.repository : Set repositories to default ------------------ 0.19s 2025-09-27 00:24:37.571670 | orchestrator | osism.commons.repository : Include distribution specific repository tasks --- 0.13s 2025-09-27 00:24:37.571681 | orchestrator | osism.commons.repository : Include tasks for Ubuntu < 24.04 ------------- 0.13s 2025-09-27 00:24:37.571692 | orchestrator | osism.commons.repository : Gather variables for each operating system --- 0.13s 2025-09-27 00:24:37.571703 | orchestrator | Install required packages (RedHat) -------------------------------------- 0.09s 2025-09-27 00:24:37.868361 | orchestrator | + osism apply bootstrap 2025-09-27 00:24:49.899032 | orchestrator | 2025-09-27 00:24:49 | INFO  | Task 7198dbb5-163c-46ab-bd07-d63caa706455 (bootstrap) was prepared for execution. 2025-09-27 00:24:49.899131 | orchestrator | 2025-09-27 00:24:49 | INFO  | It takes a moment until task 7198dbb5-163c-46ab-bd07-d63caa706455 (bootstrap) has been started and output is visible here. 2025-09-27 00:25:05.286291 | orchestrator | 2025-09-27 00:25:05.286392 | orchestrator | PLAY [Group hosts based on state bootstrap] ************************************ 2025-09-27 00:25:05.286409 | orchestrator | 2025-09-27 00:25:05.286421 | orchestrator | TASK [Group hosts based on state bootstrap] ************************************ 2025-09-27 00:25:05.286433 | orchestrator | Saturday 27 September 2025 00:24:53 +0000 (0:00:00.146) 0:00:00.146 **** 2025-09-27 00:25:05.286444 | orchestrator | ok: [testbed-manager] 2025-09-27 00:25:05.286456 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:25:05.286467 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:25:05.286478 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:25:05.286489 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:25:05.286500 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:25:05.286511 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:25:05.286522 | orchestrator | 2025-09-27 00:25:05.286533 | orchestrator | PLAY [Gather facts for all hosts] ********************************************** 2025-09-27 00:25:05.286544 | orchestrator | 2025-09-27 00:25:05.286555 | orchestrator | TASK [Gathers facts about hosts] *********************************************** 2025-09-27 00:25:05.286566 | orchestrator | Saturday 27 September 2025 00:24:54 +0000 (0:00:00.209) 0:00:00.356 **** 2025-09-27 00:25:05.286576 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:25:05.286587 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:25:05.286598 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:25:05.286608 | orchestrator | ok: [testbed-manager] 2025-09-27 00:25:05.286638 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:25:05.286649 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:25:05.286660 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:25:05.286671 | orchestrator | 2025-09-27 00:25:05.286682 | orchestrator | PLAY [Gather facts for all hosts (if using --limit)] *************************** 2025-09-27 00:25:05.286692 | orchestrator | 2025-09-27 00:25:05.286703 | orchestrator | TASK [Gathers facts about hosts] *********************************************** 2025-09-27 00:25:05.286714 | orchestrator | Saturday 27 September 2025 00:24:57 +0000 (0:00:03.688) 0:00:04.044 **** 2025-09-27 00:25:05.286725 | orchestrator | skipping: [testbed-manager] => 
(item=testbed-manager)  2025-09-27 00:25:05.286751 | orchestrator | skipping: [testbed-manager] => (item=testbed-node-3)  2025-09-27 00:25:05.286762 | orchestrator | skipping: [testbed-node-4] => (item=testbed-manager)  2025-09-27 00:25:05.286773 | orchestrator | skipping: [testbed-manager] => (item=testbed-node-4)  2025-09-27 00:25:05.286784 | orchestrator | skipping: [testbed-node-3] => (item=testbed-manager)  2025-09-27 00:25:05.286794 | orchestrator | skipping: [testbed-manager] => (item=testbed-node-5)  2025-09-27 00:25:05.286806 | orchestrator | skipping: [testbed-node-4] => (item=testbed-node-3)  2025-09-27 00:25:05.286818 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-3)  2025-09-27 00:25:05.286832 | orchestrator | skipping: [testbed-node-5] => (item=testbed-manager)  2025-09-27 00:25:05.286845 | orchestrator | skipping: [testbed-node-0] => (item=testbed-manager)  2025-09-27 00:25:05.286858 | orchestrator | skipping: [testbed-manager] => (item=testbed-node-0)  2025-09-27 00:25:05.286871 | orchestrator | skipping: [testbed-node-4] => (item=testbed-node-4)  2025-09-27 00:25:05.286885 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-4)  2025-09-27 00:25:05.286897 | orchestrator | skipping: [testbed-node-4] => (item=testbed-node-5)  2025-09-27 00:25:05.286910 | orchestrator | skipping: [testbed-node-5] => (item=testbed-node-3)  2025-09-27 00:25:05.286923 | orchestrator | skipping: [testbed-node-0] => (item=testbed-node-3)  2025-09-27 00:25:05.286935 | orchestrator | skipping: [testbed-manager] => (item=testbed-node-1)  2025-09-27 00:25:05.286948 | orchestrator | skipping: [testbed-node-4] => (item=testbed-node-0)  2025-09-27 00:25:05.286960 | orchestrator | skipping: [testbed-node-1] => (item=testbed-manager)  2025-09-27 00:25:05.286973 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-5)  2025-09-27 00:25:05.286986 | orchestrator | skipping: [testbed-node-5] => (item=testbed-node-4)  2025-09-27 00:25:05.286998 | orchestrator | skipping: [testbed-node-4] => (item=testbed-node-1)  2025-09-27 00:25:05.287011 | orchestrator | skipping: [testbed-node-0] => (item=testbed-node-4)  2025-09-27 00:25:05.287024 | orchestrator | skipping: [testbed-manager] => (item=testbed-node-2)  2025-09-27 00:25:05.287037 | orchestrator | skipping: [testbed-node-4] => (item=testbed-node-2)  2025-09-27 00:25:05.287049 | orchestrator | skipping: [testbed-node-1] => (item=testbed-node-3)  2025-09-27 00:25:05.287062 | orchestrator | skipping: [testbed-manager] 2025-09-27 00:25:05.287074 | orchestrator | skipping: [testbed-node-5] => (item=testbed-node-5)  2025-09-27 00:25:05.287087 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:25:05.287100 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-0)  2025-09-27 00:25:05.287112 | orchestrator | skipping: [testbed-node-0] => (item=testbed-node-5)  2025-09-27 00:25:05.287125 | orchestrator | skipping: [testbed-node-1] => (item=testbed-node-4)  2025-09-27 00:25:05.287137 | orchestrator | skipping: [testbed-node-2] => (item=testbed-manager)  2025-09-27 00:25:05.287150 | orchestrator | skipping: [testbed-node-0] => (item=testbed-node-0)  2025-09-27 00:25:05.287162 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-1)  2025-09-27 00:25:05.287174 | orchestrator | skipping: [testbed-node-1] => (item=testbed-node-5)  2025-09-27 00:25:05.287221 | orchestrator | skipping: [testbed-node-5] => (item=testbed-node-0)  2025-09-27 00:25:05.287234 | orchestrator | skipping: [testbed-node-2] => 
(item=testbed-node-3)  2025-09-27 00:25:05.287245 | orchestrator | skipping: [testbed-node-0] => (item=testbed-node-1)  2025-09-27 00:25:05.287263 | orchestrator | skipping: [testbed-node-1] => (item=testbed-node-0)  2025-09-27 00:25:05.287274 | orchestrator | skipping: [testbed-node-2] => (item=testbed-node-4)  2025-09-27 00:25:05.287285 | orchestrator | skipping: [testbed-node-0] => (item=testbed-node-2)  2025-09-27 00:25:05.287296 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:25:05.287307 | orchestrator | skipping: [testbed-node-1] => (item=testbed-node-1)  2025-09-27 00:25:05.287317 | orchestrator | skipping: [testbed-node-5] => (item=testbed-node-1)  2025-09-27 00:25:05.287328 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-2)  2025-09-27 00:25:05.287356 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:25:05.287367 | orchestrator | skipping: [testbed-node-2] => (item=testbed-node-5)  2025-09-27 00:25:05.287378 | orchestrator | skipping: [testbed-node-1] => (item=testbed-node-2)  2025-09-27 00:25:05.287389 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:25:05.287400 | orchestrator | skipping: [testbed-node-5] => (item=testbed-node-2)  2025-09-27 00:25:05.287411 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:25:05.287421 | orchestrator | skipping: [testbed-node-2] => (item=testbed-node-0)  2025-09-27 00:25:05.287432 | orchestrator | skipping: [testbed-node-2] => (item=testbed-node-1)  2025-09-27 00:25:05.287443 | orchestrator | skipping: [testbed-node-2] => (item=testbed-node-2)  2025-09-27 00:25:05.287454 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:25:05.287465 | orchestrator | 2025-09-27 00:25:05.287476 | orchestrator | PLAY [Apply bootstrap roles part 1] ******************************************** 2025-09-27 00:25:05.287486 | orchestrator | 2025-09-27 00:25:05.287497 | orchestrator | TASK [osism.commons.hostname : Set hostname] *********************************** 2025-09-27 00:25:05.287508 | orchestrator | Saturday 27 September 2025 00:24:58 +0000 (0:00:00.385) 0:00:04.430 **** 2025-09-27 00:25:05.287519 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:25:05.287530 | orchestrator | ok: [testbed-manager] 2025-09-27 00:25:05.287541 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:25:05.287551 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:25:05.287562 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:25:05.287573 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:25:05.287584 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:25:05.287594 | orchestrator | 2025-09-27 00:25:05.287605 | orchestrator | TASK [osism.commons.hostname : Copy /etc/hostname] ***************************** 2025-09-27 00:25:05.287629 | orchestrator | Saturday 27 September 2025 00:24:59 +0000 (0:00:01.140) 0:00:05.571 **** 2025-09-27 00:25:05.287641 | orchestrator | ok: [testbed-manager] 2025-09-27 00:25:05.287651 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:25:05.287672 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:25:05.287683 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:25:05.287694 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:25:05.287704 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:25:05.287715 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:25:05.287726 | orchestrator | 2025-09-27 00:25:05.287737 | orchestrator | TASK [osism.commons.hosts : Include type specific tasks] *********************** 2025-09-27 00:25:05.287748 | orchestrator | Saturday 27 September 2025 00:25:00 +0000 
(0:00:01.179) 0:00:06.750 **** 2025-09-27 00:25:05.287759 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/hosts/tasks/type-template.yml for testbed-manager, testbed-node-3, testbed-node-4, testbed-node-5, testbed-node-0, testbed-node-1, testbed-node-2 2025-09-27 00:25:05.287771 | orchestrator | 2025-09-27 00:25:05.287782 | orchestrator | TASK [osism.commons.hosts : Copy /etc/hosts file] ****************************** 2025-09-27 00:25:05.287793 | orchestrator | Saturday 27 September 2025 00:25:00 +0000 (0:00:00.292) 0:00:07.043 **** 2025-09-27 00:25:05.287804 | orchestrator | changed: [testbed-manager] 2025-09-27 00:25:05.287815 | orchestrator | changed: [testbed-node-3] 2025-09-27 00:25:05.287825 | orchestrator | changed: [testbed-node-0] 2025-09-27 00:25:05.287836 | orchestrator | changed: [testbed-node-2] 2025-09-27 00:25:05.287847 | orchestrator | changed: [testbed-node-1] 2025-09-27 00:25:05.287864 | orchestrator | changed: [testbed-node-5] 2025-09-27 00:25:05.287875 | orchestrator | changed: [testbed-node-4] 2025-09-27 00:25:05.287885 | orchestrator | 2025-09-27 00:25:05.287896 | orchestrator | TASK [osism.commons.proxy : Include distribution specific tasks] *************** 2025-09-27 00:25:05.287907 | orchestrator | Saturday 27 September 2025 00:25:02 +0000 (0:00:02.007) 0:00:09.051 **** 2025-09-27 00:25:05.287917 | orchestrator | skipping: [testbed-manager] 2025-09-27 00:25:05.287929 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/proxy/tasks/Debian-family.yml for testbed-node-3, testbed-node-4, testbed-node-5, testbed-node-0, testbed-node-1, testbed-node-2 2025-09-27 00:25:05.287941 | orchestrator | 2025-09-27 00:25:05.287951 | orchestrator | TASK [osism.commons.proxy : Configure proxy parameters for apt] **************** 2025-09-27 00:25:05.287962 | orchestrator | Saturday 27 September 2025 00:25:03 +0000 (0:00:00.272) 0:00:09.324 **** 2025-09-27 00:25:05.287973 | orchestrator | changed: [testbed-node-3] 2025-09-27 00:25:05.287983 | orchestrator | changed: [testbed-node-5] 2025-09-27 00:25:05.287994 | orchestrator | changed: [testbed-node-0] 2025-09-27 00:25:05.288004 | orchestrator | changed: [testbed-node-4] 2025-09-27 00:25:05.288015 | orchestrator | changed: [testbed-node-1] 2025-09-27 00:25:05.288026 | orchestrator | changed: [testbed-node-2] 2025-09-27 00:25:05.288036 | orchestrator | 2025-09-27 00:25:05.288047 | orchestrator | TASK [osism.commons.proxy : Set system wide settings in environment file] ****** 2025-09-27 00:25:05.288058 | orchestrator | Saturday 27 September 2025 00:25:04 +0000 (0:00:01.077) 0:00:10.401 **** 2025-09-27 00:25:05.288068 | orchestrator | skipping: [testbed-manager] 2025-09-27 00:25:05.288079 | orchestrator | changed: [testbed-node-3] 2025-09-27 00:25:05.288090 | orchestrator | changed: [testbed-node-2] 2025-09-27 00:25:05.288100 | orchestrator | changed: [testbed-node-5] 2025-09-27 00:25:05.288110 | orchestrator | changed: [testbed-node-0] 2025-09-27 00:25:05.288121 | orchestrator | changed: [testbed-node-1] 2025-09-27 00:25:05.288132 | orchestrator | changed: [testbed-node-4] 2025-09-27 00:25:05.288143 | orchestrator | 2025-09-27 00:25:05.288153 | orchestrator | TASK [osism.commons.proxy : Remove system wide settings in environment file] *** 2025-09-27 00:25:05.288164 | orchestrator | Saturday 27 September 2025 00:25:04 +0000 (0:00:00.575) 0:00:10.976 **** 2025-09-27 00:25:05.288175 | orchestrator | skipping: [testbed-node-3] 2025-09-27 
00:25:05.288200 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:25:05.288212 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:25:05.288222 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:25:05.288233 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:25:05.288244 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:25:05.288255 | orchestrator | ok: [testbed-manager] 2025-09-27 00:25:05.288265 | orchestrator | 2025-09-27 00:25:05.288284 | orchestrator | TASK [osism.commons.resolvconf : Check minimum and maximum number of name servers] *** 2025-09-27 00:25:05.288296 | orchestrator | Saturday 27 September 2025 00:25:05 +0000 (0:00:00.406) 0:00:11.382 **** 2025-09-27 00:25:05.288307 | orchestrator | skipping: [testbed-manager] 2025-09-27 00:25:05.288317 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:25:05.288335 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:25:18.811032 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:25:18.811132 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:25:18.811148 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:25:18.811160 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:25:18.811172 | orchestrator | 2025-09-27 00:25:18.811212 | orchestrator | TASK [osism.commons.resolvconf : Include resolvconf tasks] ********************* 2025-09-27 00:25:18.811226 | orchestrator | Saturday 27 September 2025 00:25:05 +0000 (0:00:00.226) 0:00:11.609 **** 2025-09-27 00:25:18.811239 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/resolvconf/tasks/configure-resolv.yml for testbed-manager, testbed-node-3, testbed-node-4, testbed-node-5, testbed-node-0, testbed-node-1, testbed-node-2 2025-09-27 00:25:18.811253 | orchestrator | 2025-09-27 00:25:18.811264 | orchestrator | TASK [osism.commons.resolvconf : Include distribution specific installation tasks] *** 2025-09-27 00:25:18.811295 | orchestrator | Saturday 27 September 2025 00:25:05 +0000 (0:00:00.289) 0:00:11.899 **** 2025-09-27 00:25:18.811307 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/resolvconf/tasks/install-Debian-family.yml for testbed-manager, testbed-node-3, testbed-node-4, testbed-node-5, testbed-node-0, testbed-node-1, testbed-node-2 2025-09-27 00:25:18.811318 | orchestrator | 2025-09-27 00:25:18.811329 | orchestrator | TASK [osism.commons.resolvconf : Remove packages configuring /etc/resolv.conf] *** 2025-09-27 00:25:18.811340 | orchestrator | Saturday 27 September 2025 00:25:05 +0000 (0:00:00.315) 0:00:12.214 **** 2025-09-27 00:25:18.811351 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:25:18.811363 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:25:18.811374 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:25:18.811385 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:25:18.811407 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:25:18.811419 | orchestrator | ok: [testbed-manager] 2025-09-27 00:25:18.811430 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:25:18.811441 | orchestrator | 2025-09-27 00:25:18.811452 | orchestrator | TASK [osism.commons.resolvconf : Install package systemd-resolved] ************* 2025-09-27 00:25:18.811463 | orchestrator | Saturday 27 September 2025 00:25:07 +0000 (0:00:01.528) 0:00:13.743 **** 2025-09-27 00:25:18.811474 | orchestrator | skipping: [testbed-manager] 2025-09-27 00:25:18.811485 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:25:18.811496 | 
orchestrator | skipping: [testbed-node-4] 2025-09-27 00:25:18.811507 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:25:18.811518 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:25:18.811528 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:25:18.811540 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:25:18.811552 | orchestrator | 2025-09-27 00:25:18.811565 | orchestrator | TASK [osism.commons.resolvconf : Retrieve file status of /etc/resolv.conf] ***** 2025-09-27 00:25:18.811577 | orchestrator | Saturday 27 September 2025 00:25:07 +0000 (0:00:00.278) 0:00:14.021 **** 2025-09-27 00:25:18.811590 | orchestrator | ok: [testbed-manager] 2025-09-27 00:25:18.811603 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:25:18.811615 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:25:18.811627 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:25:18.811639 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:25:18.811651 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:25:18.811663 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:25:18.811676 | orchestrator | 2025-09-27 00:25:18.811688 | orchestrator | TASK [osism.commons.resolvconf : Archive existing file /etc/resolv.conf] ******* 2025-09-27 00:25:18.811701 | orchestrator | Saturday 27 September 2025 00:25:08 +0000 (0:00:00.552) 0:00:14.574 **** 2025-09-27 00:25:18.811714 | orchestrator | skipping: [testbed-manager] 2025-09-27 00:25:18.811727 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:25:18.811740 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:25:18.811752 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:25:18.811764 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:25:18.811777 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:25:18.811789 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:25:18.811802 | orchestrator | 2025-09-27 00:25:18.811815 | orchestrator | TASK [osism.commons.resolvconf : Link /run/systemd/resolve/stub-resolv.conf to /etc/resolv.conf] *** 2025-09-27 00:25:18.811828 | orchestrator | Saturday 27 September 2025 00:25:08 +0000 (0:00:00.293) 0:00:14.867 **** 2025-09-27 00:25:18.811841 | orchestrator | ok: [testbed-manager] 2025-09-27 00:25:18.811853 | orchestrator | changed: [testbed-node-3] 2025-09-27 00:25:18.811866 | orchestrator | changed: [testbed-node-4] 2025-09-27 00:25:18.811879 | orchestrator | changed: [testbed-node-5] 2025-09-27 00:25:18.811891 | orchestrator | changed: [testbed-node-0] 2025-09-27 00:25:18.811903 | orchestrator | changed: [testbed-node-1] 2025-09-27 00:25:18.811914 | orchestrator | changed: [testbed-node-2] 2025-09-27 00:25:18.811925 | orchestrator | 2025-09-27 00:25:18.811936 | orchestrator | TASK [osism.commons.resolvconf : Copy configuration files] ********************* 2025-09-27 00:25:18.811955 | orchestrator | Saturday 27 September 2025 00:25:09 +0000 (0:00:00.553) 0:00:15.421 **** 2025-09-27 00:25:18.811966 | orchestrator | ok: [testbed-manager] 2025-09-27 00:25:18.811977 | orchestrator | changed: [testbed-node-3] 2025-09-27 00:25:18.811988 | orchestrator | changed: [testbed-node-4] 2025-09-27 00:25:18.811998 | orchestrator | changed: [testbed-node-5] 2025-09-27 00:25:18.812009 | orchestrator | changed: [testbed-node-2] 2025-09-27 00:25:18.812020 | orchestrator | changed: [testbed-node-0] 2025-09-27 00:25:18.812030 | orchestrator | changed: [testbed-node-1] 2025-09-27 00:25:18.812041 | orchestrator | 2025-09-27 00:25:18.812052 | orchestrator | TASK [osism.commons.resolvconf : Start/enable 
systemd-resolved service] ******** 2025-09-27 00:25:18.812063 | orchestrator | Saturday 27 September 2025 00:25:10 +0000 (0:00:01.150) 0:00:16.572 **** 2025-09-27 00:25:18.812074 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:25:18.812085 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:25:18.812096 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:25:18.812107 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:25:18.812117 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:25:18.812128 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:25:18.812139 | orchestrator | ok: [testbed-manager] 2025-09-27 00:25:18.812150 | orchestrator | 2025-09-27 00:25:18.812161 | orchestrator | TASK [osism.commons.resolvconf : Include distribution specific configuration tasks] *** 2025-09-27 00:25:18.812172 | orchestrator | Saturday 27 September 2025 00:25:12 +0000 (0:00:02.049) 0:00:18.621 **** 2025-09-27 00:25:18.812216 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/resolvconf/tasks/configure-Debian-family.yml for testbed-manager, testbed-node-3, testbed-node-4, testbed-node-5, testbed-node-0, testbed-node-1, testbed-node-2 2025-09-27 00:25:18.812229 | orchestrator | 2025-09-27 00:25:18.812240 | orchestrator | TASK [osism.commons.resolvconf : Restart systemd-resolved service] ************* 2025-09-27 00:25:18.812251 | orchestrator | Saturday 27 September 2025 00:25:12 +0000 (0:00:00.355) 0:00:18.976 **** 2025-09-27 00:25:18.812262 | orchestrator | skipping: [testbed-manager] 2025-09-27 00:25:18.812273 | orchestrator | changed: [testbed-node-1] 2025-09-27 00:25:18.812284 | orchestrator | changed: [testbed-node-0] 2025-09-27 00:25:18.812295 | orchestrator | changed: [testbed-node-3] 2025-09-27 00:25:18.812305 | orchestrator | changed: [testbed-node-2] 2025-09-27 00:25:18.812316 | orchestrator | changed: [testbed-node-4] 2025-09-27 00:25:18.812327 | orchestrator | changed: [testbed-node-5] 2025-09-27 00:25:18.812338 | orchestrator | 2025-09-27 00:25:18.812349 | orchestrator | TASK [osism.commons.repository : Gather variables for each operating system] *** 2025-09-27 00:25:18.812360 | orchestrator | Saturday 27 September 2025 00:25:13 +0000 (0:00:01.257) 0:00:20.234 **** 2025-09-27 00:25:18.812371 | orchestrator | ok: [testbed-manager] 2025-09-27 00:25:18.812381 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:25:18.812392 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:25:18.812403 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:25:18.812414 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:25:18.812425 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:25:18.812435 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:25:18.812446 | orchestrator | 2025-09-27 00:25:18.812457 | orchestrator | TASK [osism.commons.repository : Set repository_default fact to default value] *** 2025-09-27 00:25:18.812468 | orchestrator | Saturday 27 September 2025 00:25:14 +0000 (0:00:00.206) 0:00:20.441 **** 2025-09-27 00:25:18.812479 | orchestrator | ok: [testbed-manager] 2025-09-27 00:25:18.812489 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:25:18.812500 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:25:18.812515 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:25:18.812526 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:25:18.812537 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:25:18.812548 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:25:18.812559 | orchestrator | 2025-09-27 00:25:18.812570 | orchestrator | TASK 
[osism.commons.repository : Set repositories to default] ****************** 2025-09-27 00:25:18.812581 | orchestrator | Saturday 27 September 2025 00:25:14 +0000 (0:00:00.248) 0:00:20.689 **** 2025-09-27 00:25:18.812598 | orchestrator | ok: [testbed-manager] 2025-09-27 00:25:18.812609 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:25:18.812619 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:25:18.812630 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:25:18.812641 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:25:18.812651 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:25:18.812662 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:25:18.812675 | orchestrator | 2025-09-27 00:25:18.812694 | orchestrator | TASK [osism.commons.repository : Include distribution specific repository tasks] *** 2025-09-27 00:25:18.812713 | orchestrator | Saturday 27 September 2025 00:25:14 +0000 (0:00:00.272) 0:00:20.962 **** 2025-09-27 00:25:18.812732 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/repository/tasks/Ubuntu.yml for testbed-manager, testbed-node-3, testbed-node-4, testbed-node-5, testbed-node-0, testbed-node-1, testbed-node-2 2025-09-27 00:25:18.812752 | orchestrator | 2025-09-27 00:25:18.812771 | orchestrator | TASK [osism.commons.repository : Create /etc/apt/sources.list.d directory] ***** 2025-09-27 00:25:18.812788 | orchestrator | Saturday 27 September 2025 00:25:15 +0000 (0:00:00.331) 0:00:21.293 **** 2025-09-27 00:25:18.812806 | orchestrator | ok: [testbed-manager] 2025-09-27 00:25:18.812824 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:25:18.812843 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:25:18.812862 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:25:18.812879 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:25:18.812898 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:25:18.812917 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:25:18.812936 | orchestrator | 2025-09-27 00:25:18.812949 | orchestrator | TASK [osism.commons.repository : Include tasks for Ubuntu < 24.04] ************* 2025-09-27 00:25:18.812960 | orchestrator | Saturday 27 September 2025 00:25:15 +0000 (0:00:00.567) 0:00:21.860 **** 2025-09-27 00:25:18.812971 | orchestrator | skipping: [testbed-manager] 2025-09-27 00:25:18.812981 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:25:18.812992 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:25:18.813003 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:25:18.813013 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:25:18.813024 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:25:18.813034 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:25:18.813045 | orchestrator | 2025-09-27 00:25:18.813056 | orchestrator | TASK [osism.commons.repository : Copy 99osism apt configuration] *************** 2025-09-27 00:25:18.813066 | orchestrator | Saturday 27 September 2025 00:25:15 +0000 (0:00:00.253) 0:00:22.114 **** 2025-09-27 00:25:18.813077 | orchestrator | ok: [testbed-manager] 2025-09-27 00:25:18.813087 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:25:18.813098 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:25:18.813109 | orchestrator | changed: [testbed-node-0] 2025-09-27 00:25:18.813119 | orchestrator | changed: [testbed-node-1] 2025-09-27 00:25:18.813130 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:25:18.813141 | orchestrator | changed: [testbed-node-2] 2025-09-27 00:25:18.813151 | orchestrator | 2025-09-27 
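For orientation: the repository steps above and below (create /etc/apt/sources.list.d, drop a 99osism apt configuration, remove the legacy sources.list, and install a deb822-style ubuntu.sources file) roughly correspond to tasks like the following sketch. It is illustrative only; the mirror URI, suite names, and file contents are assumptions, not the actual osism.commons.repository implementation.

```yaml
# Illustrative sketch only -- not the real osism.commons.repository tasks.
- name: Create /etc/apt/sources.list.d directory
  ansible.builtin.file:
    path: /etc/apt/sources.list.d
    state: directory
    mode: "0755"

- name: Copy ubuntu.sources file (deb822 style, Ubuntu 24.04 "noble" assumed)
  ansible.builtin.copy:
    dest: /etc/apt/sources.list.d/ubuntu.sources
    mode: "0644"
    content: |
      Types: deb
      URIs: http://archive.ubuntu.com/ubuntu
      Suites: noble noble-updates noble-backports noble-security
      Components: main restricted universe multiverse
      Signed-By: /usr/share/keyrings/ubuntu-archive-keyring.gpg

- name: Remove legacy sources.list file
  ansible.builtin.file:
    path: /etc/apt/sources.list
    state: absent
```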
00:25:18.813162 | orchestrator | TASK [osism.commons.repository : Remove sources.list file] ********************* 2025-09-27 00:25:18.813173 | orchestrator | Saturday 27 September 2025 00:25:17 +0000 (0:00:01.144) 0:00:23.258 **** 2025-09-27 00:25:18.813205 | orchestrator | ok: [testbed-manager] 2025-09-27 00:25:18.813221 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:25:18.813232 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:25:18.813243 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:25:18.813254 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:25:18.813264 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:25:18.813275 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:25:18.813286 | orchestrator | 2025-09-27 00:25:18.813297 | orchestrator | TASK [osism.commons.repository : Copy ubuntu.sources file] ********************* 2025-09-27 00:25:18.813308 | orchestrator | Saturday 27 September 2025 00:25:17 +0000 (0:00:00.608) 0:00:23.866 **** 2025-09-27 00:25:18.813318 | orchestrator | ok: [testbed-manager] 2025-09-27 00:25:18.813338 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:25:18.813349 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:25:18.813359 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:25:18.813380 | orchestrator | changed: [testbed-node-1] 2025-09-27 00:25:59.772136 | orchestrator | changed: [testbed-node-0] 2025-09-27 00:25:59.772339 | orchestrator | changed: [testbed-node-2] 2025-09-27 00:25:59.772360 | orchestrator | 2025-09-27 00:25:59.772385 | orchestrator | TASK [osism.commons.repository : Update package cache] ************************* 2025-09-27 00:25:59.772408 | orchestrator | Saturday 27 September 2025 00:25:18 +0000 (0:00:01.173) 0:00:25.040 **** 2025-09-27 00:25:59.772428 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:25:59.772447 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:25:59.772464 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:25:59.772482 | orchestrator | changed: [testbed-manager] 2025-09-27 00:25:59.772500 | orchestrator | changed: [testbed-node-2] 2025-09-27 00:25:59.772520 | orchestrator | changed: [testbed-node-0] 2025-09-27 00:25:59.772538 | orchestrator | changed: [testbed-node-1] 2025-09-27 00:25:59.772557 | orchestrator | 2025-09-27 00:25:59.772569 | orchestrator | TASK [osism.services.rsyslog : Gather variables for each operating system] ***** 2025-09-27 00:25:59.772581 | orchestrator | Saturday 27 September 2025 00:25:35 +0000 (0:00:16.807) 0:00:41.848 **** 2025-09-27 00:25:59.772592 | orchestrator | ok: [testbed-manager] 2025-09-27 00:25:59.772603 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:25:59.772614 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:25:59.772624 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:25:59.772635 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:25:59.772646 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:25:59.772656 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:25:59.772669 | orchestrator | 2025-09-27 00:25:59.772682 | orchestrator | TASK [osism.services.rsyslog : Set rsyslog_user variable to default value] ***** 2025-09-27 00:25:59.772694 | orchestrator | Saturday 27 September 2025 00:25:35 +0000 (0:00:00.232) 0:00:42.080 **** 2025-09-27 00:25:59.772707 | orchestrator | ok: [testbed-manager] 2025-09-27 00:25:59.772719 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:25:59.772732 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:25:59.772744 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:25:59.772756 | orchestrator | ok: 
[testbed-node-0] 2025-09-27 00:25:59.772768 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:25:59.772781 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:25:59.772793 | orchestrator | 2025-09-27 00:25:59.772806 | orchestrator | TASK [osism.services.rsyslog : Set rsyslog_workdir variable to default value] *** 2025-09-27 00:25:59.772818 | orchestrator | Saturday 27 September 2025 00:25:36 +0000 (0:00:00.234) 0:00:42.314 **** 2025-09-27 00:25:59.772830 | orchestrator | ok: [testbed-manager] 2025-09-27 00:25:59.772842 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:25:59.772855 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:25:59.772867 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:25:59.772879 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:25:59.772891 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:25:59.772903 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:25:59.772914 | orchestrator | 2025-09-27 00:25:59.772927 | orchestrator | TASK [osism.services.rsyslog : Include distribution specific install tasks] **** 2025-09-27 00:25:59.772939 | orchestrator | Saturday 27 September 2025 00:25:36 +0000 (0:00:00.222) 0:00:42.537 **** 2025-09-27 00:25:59.772954 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/rsyslog/tasks/install-Debian-family.yml for testbed-manager, testbed-node-3, testbed-node-4, testbed-node-5, testbed-node-0, testbed-node-1, testbed-node-2 2025-09-27 00:25:59.772969 | orchestrator | 2025-09-27 00:25:59.772982 | orchestrator | TASK [osism.services.rsyslog : Install rsyslog package] ************************ 2025-09-27 00:25:59.772994 | orchestrator | Saturday 27 September 2025 00:25:36 +0000 (0:00:00.281) 0:00:42.818 **** 2025-09-27 00:25:59.773007 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:25:59.773020 | orchestrator | ok: [testbed-manager] 2025-09-27 00:25:59.773031 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:25:59.773042 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:25:59.773083 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:25:59.773095 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:25:59.773105 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:25:59.773116 | orchestrator | 2025-09-27 00:25:59.773127 | orchestrator | TASK [osism.services.rsyslog : Copy rsyslog.conf configuration file] *********** 2025-09-27 00:25:59.773137 | orchestrator | Saturday 27 September 2025 00:25:38 +0000 (0:00:01.556) 0:00:44.375 **** 2025-09-27 00:25:59.773148 | orchestrator | changed: [testbed-manager] 2025-09-27 00:25:59.773159 | orchestrator | changed: [testbed-node-3] 2025-09-27 00:25:59.773169 | orchestrator | changed: [testbed-node-0] 2025-09-27 00:25:59.773180 | orchestrator | changed: [testbed-node-4] 2025-09-27 00:25:59.773213 | orchestrator | changed: [testbed-node-5] 2025-09-27 00:25:59.773224 | orchestrator | changed: [testbed-node-1] 2025-09-27 00:25:59.773234 | orchestrator | changed: [testbed-node-2] 2025-09-27 00:25:59.773245 | orchestrator | 2025-09-27 00:25:59.773256 | orchestrator | TASK [osism.services.rsyslog : Manage rsyslog service] ************************* 2025-09-27 00:25:59.773266 | orchestrator | Saturday 27 September 2025 00:25:39 +0000 (0:00:01.195) 0:00:45.570 **** 2025-09-27 00:25:59.773277 | orchestrator | ok: [testbed-manager] 2025-09-27 00:25:59.773288 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:25:59.773298 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:25:59.773309 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:25:59.773319 | 
orchestrator | ok: [testbed-node-1] 2025-09-27 00:25:59.773330 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:25:59.773357 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:25:59.773368 | orchestrator | 2025-09-27 00:25:59.773379 | orchestrator | TASK [osism.services.rsyslog : Include fluentd tasks] ************************** 2025-09-27 00:25:59.773390 | orchestrator | Saturday 27 September 2025 00:25:40 +0000 (0:00:00.864) 0:00:46.435 **** 2025-09-27 00:25:59.773402 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/rsyslog/tasks/fluentd.yml for testbed-manager, testbed-node-3, testbed-node-4, testbed-node-5, testbed-node-0, testbed-node-1, testbed-node-2 2025-09-27 00:25:59.773414 | orchestrator | 2025-09-27 00:25:59.773425 | orchestrator | TASK [osism.services.rsyslog : Forward syslog message to local fluentd daemon] *** 2025-09-27 00:25:59.773436 | orchestrator | Saturday 27 September 2025 00:25:40 +0000 (0:00:00.306) 0:00:46.741 **** 2025-09-27 00:25:59.773447 | orchestrator | changed: [testbed-manager] 2025-09-27 00:25:59.773457 | orchestrator | changed: [testbed-node-3] 2025-09-27 00:25:59.773468 | orchestrator | changed: [testbed-node-5] 2025-09-27 00:25:59.773479 | orchestrator | changed: [testbed-node-0] 2025-09-27 00:25:59.773489 | orchestrator | changed: [testbed-node-1] 2025-09-27 00:25:59.773500 | orchestrator | changed: [testbed-node-4] 2025-09-27 00:25:59.773510 | orchestrator | changed: [testbed-node-2] 2025-09-27 00:25:59.773521 | orchestrator | 2025-09-27 00:25:59.773550 | orchestrator | TASK [osism.services.rsyslog : Include additional log server tasks] ************ 2025-09-27 00:25:59.773561 | orchestrator | Saturday 27 September 2025 00:25:41 +0000 (0:00:01.185) 0:00:47.927 **** 2025-09-27 00:25:59.773572 | orchestrator | skipping: [testbed-manager] 2025-09-27 00:25:59.773583 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:25:59.773594 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:25:59.773605 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:25:59.773615 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:25:59.773626 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:25:59.773636 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:25:59.773646 | orchestrator | 2025-09-27 00:25:59.773657 | orchestrator | TASK [osism.commons.systohc : Install util-linux-extra package] **************** 2025-09-27 00:25:59.773668 | orchestrator | Saturday 27 September 2025 00:25:42 +0000 (0:00:00.330) 0:00:48.257 **** 2025-09-27 00:25:59.773679 | orchestrator | changed: [testbed-node-3] 2025-09-27 00:25:59.773690 | orchestrator | changed: [testbed-node-2] 2025-09-27 00:25:59.773700 | orchestrator | changed: [testbed-node-1] 2025-09-27 00:25:59.773711 | orchestrator | changed: [testbed-node-5] 2025-09-27 00:25:59.773731 | orchestrator | changed: [testbed-node-0] 2025-09-27 00:25:59.773741 | orchestrator | changed: [testbed-node-4] 2025-09-27 00:25:59.773752 | orchestrator | changed: [testbed-manager] 2025-09-27 00:25:59.773763 | orchestrator | 2025-09-27 00:25:59.773773 | orchestrator | TASK [osism.commons.systohc : Sync hardware clock] ***************************** 2025-09-27 00:25:59.773784 | orchestrator | Saturday 27 September 2025 00:25:54 +0000 (0:00:12.455) 0:01:00.713 **** 2025-09-27 00:25:59.773795 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:25:59.773805 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:25:59.773816 | orchestrator | ok: [testbed-node-4] 2025-09-27 
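The "Forward syslog message to local fluentd daemon" step above most plausibly installs an rsyslog drop-in that relays all messages to a fluentd syslog input on localhost. A minimal sketch, assuming a drop-in name and target port that the log does not confirm:

```yaml
# Sketch only: the drop-in name and port 5140 are assumptions.
- name: Forward syslog messages to the local fluentd daemon
  ansible.builtin.copy:
    dest: /etc/rsyslog.d/49-fluentd.conf
    mode: "0644"
    content: |
      # Relay all messages to a fluentd syslog input listening on localhost
      *.* action(type="omfwd" target="127.0.0.1" port="5140" protocol="udp")
  # (a handler restarting rsyslog would normally be notified here)
```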
00:25:59.773827 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:25:59.773838 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:25:59.773848 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:25:59.773859 | orchestrator | ok: [testbed-manager] 2025-09-27 00:25:59.773869 | orchestrator | 2025-09-27 00:25:59.773885 | orchestrator | TASK [osism.commons.configfs : Start sys-kernel-config mount] ****************** 2025-09-27 00:25:59.773896 | orchestrator | Saturday 27 September 2025 00:25:55 +0000 (0:00:01.186) 0:01:01.899 **** 2025-09-27 00:25:59.773907 | orchestrator | ok: [testbed-manager] 2025-09-27 00:25:59.773917 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:25:59.773928 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:25:59.773939 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:25:59.773949 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:25:59.773959 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:25:59.773970 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:25:59.773980 | orchestrator | 2025-09-27 00:25:59.773991 | orchestrator | TASK [osism.commons.packages : Gather variables for each operating system] ***** 2025-09-27 00:25:59.774002 | orchestrator | Saturday 27 September 2025 00:25:56 +0000 (0:00:00.881) 0:01:02.781 **** 2025-09-27 00:25:59.774012 | orchestrator | ok: [testbed-manager] 2025-09-27 00:25:59.774080 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:25:59.774091 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:25:59.774101 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:25:59.774112 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:25:59.774122 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:25:59.774133 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:25:59.774143 | orchestrator | 2025-09-27 00:25:59.774154 | orchestrator | TASK [osism.commons.packages : Set required_packages_distribution variable to default value] *** 2025-09-27 00:25:59.774165 | orchestrator | Saturday 27 September 2025 00:25:56 +0000 (0:00:00.220) 0:01:03.001 **** 2025-09-27 00:25:59.774176 | orchestrator | ok: [testbed-manager] 2025-09-27 00:25:59.774203 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:25:59.774213 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:25:59.774224 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:25:59.774234 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:25:59.774245 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:25:59.774255 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:25:59.774266 | orchestrator | 2025-09-27 00:25:59.774277 | orchestrator | TASK [osism.commons.packages : Include distribution specific package tasks] **** 2025-09-27 00:25:59.774287 | orchestrator | Saturday 27 September 2025 00:25:57 +0000 (0:00:00.248) 0:01:03.250 **** 2025-09-27 00:25:59.774299 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/packages/tasks/package-Debian-family.yml for testbed-manager, testbed-node-3, testbed-node-4, testbed-node-5, testbed-node-0, testbed-node-1, testbed-node-2 2025-09-27 00:25:59.774310 | orchestrator | 2025-09-27 00:25:59.774321 | orchestrator | TASK [osism.commons.packages : Install needrestart package] ******************** 2025-09-27 00:25:59.774331 | orchestrator | Saturday 27 September 2025 00:25:57 +0000 (0:00:00.326) 0:01:03.576 **** 2025-09-27 00:25:59.774342 | orchestrator | ok: [testbed-manager] 2025-09-27 00:25:59.774352 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:25:59.774363 | orchestrator | ok: [testbed-node-5] 2025-09-27 
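The systohc and configfs steps above are small housekeeping tasks: write the system time to the hardware clock and make sure the configfs mount unit is active. A minimal sketch under the assumption of systemd hosts; it is not the actual role code:

```yaml
# Rough sketch, assuming systemd hosts; not the actual osism.commons role code.
- name: Sync hardware clock from the system clock
  ansible.builtin.command: hwclock --systohc
  changed_when: false   # reported as "ok" in the log above

- name: Start sys-kernel-config mount (configfs)
  ansible.builtin.systemd:
    name: sys-kernel-config.mount
    state: started
```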
00:25:59.774374 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:25:59.774384 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:25:59.774395 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:25:59.774405 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:25:59.774423 | orchestrator | 2025-09-27 00:25:59.774434 | orchestrator | TASK [osism.commons.packages : Set needrestart mode] *************************** 2025-09-27 00:25:59.774445 | orchestrator | Saturday 27 September 2025 00:25:58 +0000 (0:00:01.651) 0:01:05.227 **** 2025-09-27 00:25:59.774456 | orchestrator | changed: [testbed-manager] 2025-09-27 00:25:59.774467 | orchestrator | changed: [testbed-node-0] 2025-09-27 00:25:59.774477 | orchestrator | changed: [testbed-node-3] 2025-09-27 00:25:59.774495 | orchestrator | changed: [testbed-node-5] 2025-09-27 00:25:59.774513 | orchestrator | changed: [testbed-node-2] 2025-09-27 00:25:59.774544 | orchestrator | changed: [testbed-node-1] 2025-09-27 00:25:59.774562 | orchestrator | changed: [testbed-node-4] 2025-09-27 00:25:59.774580 | orchestrator | 2025-09-27 00:25:59.774597 | orchestrator | TASK [osism.commons.packages : Set apt_cache_valid_time variable to default value] *** 2025-09-27 00:25:59.774614 | orchestrator | Saturday 27 September 2025 00:25:59 +0000 (0:00:00.539) 0:01:05.766 **** 2025-09-27 00:25:59.774629 | orchestrator | ok: [testbed-manager] 2025-09-27 00:25:59.774643 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:25:59.774658 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:25:59.774674 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:25:59.774689 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:25:59.774704 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:25:59.774720 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:25:59.774735 | orchestrator | 2025-09-27 00:25:59.774764 | orchestrator | TASK [osism.commons.packages : Update package cache] *************************** 2025-09-27 00:28:12.472007 | orchestrator | Saturday 27 September 2025 00:25:59 +0000 (0:00:00.234) 0:01:06.001 **** 2025-09-27 00:28:12.472164 | orchestrator | ok: [testbed-manager] 2025-09-27 00:28:12.472183 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:28:12.472229 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:28:12.472241 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:28:12.472252 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:28:12.472263 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:28:12.472274 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:28:12.472285 | orchestrator | 2025-09-27 00:28:12.472297 | orchestrator | TASK [osism.commons.packages : Download upgrade packages] ********************** 2025-09-27 00:28:12.472308 | orchestrator | Saturday 27 September 2025 00:26:00 +0000 (0:00:01.151) 0:01:07.153 **** 2025-09-27 00:28:12.472319 | orchestrator | changed: [testbed-manager] 2025-09-27 00:28:12.472331 | orchestrator | changed: [testbed-node-3] 2025-09-27 00:28:12.472342 | orchestrator | changed: [testbed-node-5] 2025-09-27 00:28:12.472352 | orchestrator | changed: [testbed-node-1] 2025-09-27 00:28:12.472363 | orchestrator | changed: [testbed-node-2] 2025-09-27 00:28:12.472374 | orchestrator | changed: [testbed-node-0] 2025-09-27 00:28:12.472384 | orchestrator | changed: [testbed-node-4] 2025-09-27 00:28:12.472395 | orchestrator | 2025-09-27 00:28:12.472406 | orchestrator | TASK [osism.commons.packages : Upgrade packages] ******************************* 2025-09-27 00:28:12.472418 | orchestrator | Saturday 27 September 2025 00:26:02 +0000 
(0:00:01.881) 0:01:09.035 **** 2025-09-27 00:28:12.472428 | orchestrator | ok: [testbed-manager] 2025-09-27 00:28:12.472439 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:28:12.472449 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:28:12.472460 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:28:12.472471 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:28:12.472481 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:28:12.472492 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:28:12.472502 | orchestrator | 2025-09-27 00:28:12.472513 | orchestrator | TASK [osism.commons.packages : Download required packages] ********************* 2025-09-27 00:28:12.472544 | orchestrator | Saturday 27 September 2025 00:26:05 +0000 (0:00:02.472) 0:01:11.508 **** 2025-09-27 00:28:12.472555 | orchestrator | ok: [testbed-manager] 2025-09-27 00:28:12.472566 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:28:12.472576 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:28:12.472587 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:28:12.472597 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:28:12.472608 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:28:12.472645 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:28:12.472656 | orchestrator | 2025-09-27 00:28:12.472667 | orchestrator | TASK [osism.commons.packages : Install required packages] ********************** 2025-09-27 00:28:12.472678 | orchestrator | Saturday 27 September 2025 00:26:44 +0000 (0:00:39.046) 0:01:50.555 **** 2025-09-27 00:28:12.472688 | orchestrator | changed: [testbed-manager] 2025-09-27 00:28:12.472699 | orchestrator | changed: [testbed-node-3] 2025-09-27 00:28:12.472709 | orchestrator | changed: [testbed-node-0] 2025-09-27 00:28:12.472719 | orchestrator | changed: [testbed-node-5] 2025-09-27 00:28:12.472730 | orchestrator | changed: [testbed-node-1] 2025-09-27 00:28:12.472740 | orchestrator | changed: [testbed-node-2] 2025-09-27 00:28:12.472751 | orchestrator | changed: [testbed-node-4] 2025-09-27 00:28:12.472761 | orchestrator | 2025-09-27 00:28:12.472772 | orchestrator | TASK [osism.commons.packages : Remove useless packages from the cache] ********* 2025-09-27 00:28:12.472783 | orchestrator | Saturday 27 September 2025 00:27:57 +0000 (0:01:13.356) 0:03:03.911 **** 2025-09-27 00:28:12.472793 | orchestrator | ok: [testbed-manager] 2025-09-27 00:28:12.472804 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:28:12.472814 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:28:12.472825 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:28:12.472835 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:28:12.472846 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:28:12.472856 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:28:12.472866 | orchestrator | 2025-09-27 00:28:12.472877 | orchestrator | TASK [osism.commons.packages : Remove dependencies that are no longer required] *** 2025-09-27 00:28:12.472888 | orchestrator | Saturday 27 September 2025 00:27:59 +0000 (0:00:01.705) 0:03:05.616 **** 2025-09-27 00:28:12.472899 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:28:12.472909 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:28:12.472920 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:28:12.472930 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:28:12.472940 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:28:12.472950 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:28:12.472961 | orchestrator | changed: [testbed-manager] 2025-09-27 00:28:12.472971 | orchestrator | 2025-09-27 
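The packages sequence above and below (set the needrestart mode, refresh the apt cache, upgrade, install the required package set, then clean caches and unused dependencies) maps closely onto the ansible.builtin.apt module. The sketch below is an assumption about how such tasks are commonly written; the package list variable and the needrestart configuration path are placeholders:

```yaml
# Hedged sketch of the package handling; variable names and paths are placeholders.
- name: Set needrestart mode to automatic
  ansible.builtin.lineinfile:
    path: /etc/needrestart/needrestart.conf
    regexp: '^\$nrconf\{restart\}'
    line: "$nrconf{restart} = 'a';"

- name: Update package cache
  ansible.builtin.apt:
    update_cache: true
    cache_valid_time: 3600

- name: Upgrade packages
  ansible.builtin.apt:
    upgrade: dist

- name: Install required packages
  ansible.builtin.apt:
    name: "{{ required_packages }}"   # placeholder variable
    state: present

- name: Remove useless packages from the cache and unused dependencies
  ansible.builtin.apt:
    autoclean: true
    autoremove: true
```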
00:28:12.472982 | orchestrator | TASK [osism.commons.sysctl : Include sysctl tasks] ***************************** 2025-09-27 00:28:12.472992 | orchestrator | Saturday 27 September 2025 00:28:11 +0000 (0:00:11.854) 0:03:17.470 **** 2025-09-27 00:28:12.473006 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/sysctl/tasks/sysctl.yml for testbed-manager, testbed-node-3, testbed-node-4, testbed-node-5, testbed-node-0, testbed-node-1, testbed-node-2 => (item={'key': 'elasticsearch', 'value': [{'name': 'vm.max_map_count', 'value': 262144}]}) 2025-09-27 00:28:12.473035 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/sysctl/tasks/sysctl.yml for testbed-manager, testbed-node-3, testbed-node-4, testbed-node-5, testbed-node-0, testbed-node-1, testbed-node-2 => (item={'key': 'rabbitmq', 'value': [{'name': 'net.ipv4.tcp_keepalive_time', 'value': 6}, {'name': 'net.ipv4.tcp_keepalive_intvl', 'value': 3}, {'name': 'net.ipv4.tcp_keepalive_probes', 'value': 3}, {'name': 'net.core.wmem_max', 'value': 16777216}, {'name': 'net.core.rmem_max', 'value': 16777216}, {'name': 'net.ipv4.tcp_fin_timeout', 'value': 20}, {'name': 'net.ipv4.tcp_tw_reuse', 'value': 1}, {'name': 'net.core.somaxconn', 'value': 4096}, {'name': 'net.ipv4.tcp_syncookies', 'value': 0}, {'name': 'net.ipv4.tcp_max_syn_backlog', 'value': 8192}]}) 2025-09-27 00:28:12.473076 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/sysctl/tasks/sysctl.yml for testbed-manager, testbed-node-3, testbed-node-4, testbed-node-5, testbed-node-0, testbed-node-1, testbed-node-2 => (item={'key': 'generic', 'value': [{'name': 'vm.swappiness', 'value': 1}]}) 2025-09-27 00:28:12.473090 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/sysctl/tasks/sysctl.yml for testbed-manager, testbed-node-3, testbed-node-4, testbed-node-5, testbed-node-0, testbed-node-1, testbed-node-2 => (item={'key': 'compute', 'value': [{'name': 'net.netfilter.nf_conntrack_max', 'value': 1048576}]}) 2025-09-27 00:28:12.473110 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/sysctl/tasks/sysctl.yml for testbed-manager, testbed-node-3, testbed-node-4, testbed-node-5, testbed-node-0, testbed-node-1, testbed-node-2 => (item={'key': 'network', 'value': [{'name': 'net.netfilter.nf_conntrack_max', 'value': 1048576}]}) 2025-09-27 00:28:12.473121 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/sysctl/tasks/sysctl.yml for testbed-manager, testbed-node-3, testbed-node-4, testbed-node-5, testbed-node-0, testbed-node-1, testbed-node-2 => (item={'key': 'k3s_node', 'value': [{'name': 'fs.inotify.max_user_instances', 'value': 1024}]}) 2025-09-27 00:28:12.473132 | orchestrator | 2025-09-27 00:28:12.473143 | orchestrator | TASK [osism.commons.sysctl : Set sysctl parameters on elasticsearch] *********** 2025-09-27 00:28:12.473154 | orchestrator | Saturday 27 September 2025 00:28:11 +0000 (0:00:00.360) 0:03:17.831 **** 2025-09-27 00:28:12.473166 | orchestrator | skipping: [testbed-manager] => (item={'name': 'vm.max_map_count', 'value': 262144})  2025-09-27 00:28:12.473177 | orchestrator | skipping: [testbed-manager] 2025-09-27 00:28:12.473188 | orchestrator | skipping: [testbed-node-3] => (item={'name': 'vm.max_map_count', 'value': 262144})  2025-09-27 00:28:12.473220 | orchestrator | skipping: [testbed-node-4] => (item={'name': 
'vm.max_map_count', 'value': 262144})  2025-09-27 00:28:12.473232 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:28:12.473243 | orchestrator | skipping: [testbed-node-5] => (item={'name': 'vm.max_map_count', 'value': 262144})  2025-09-27 00:28:12.473254 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:28:12.473264 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:28:12.473276 | orchestrator | changed: [testbed-node-0] => (item={'name': 'vm.max_map_count', 'value': 262144}) 2025-09-27 00:28:12.473287 | orchestrator | changed: [testbed-node-2] => (item={'name': 'vm.max_map_count', 'value': 262144}) 2025-09-27 00:28:12.473297 | orchestrator | changed: [testbed-node-1] => (item={'name': 'vm.max_map_count', 'value': 262144}) 2025-09-27 00:28:12.473308 | orchestrator | 2025-09-27 00:28:12.473319 | orchestrator | TASK [osism.commons.sysctl : Set sysctl parameters on rabbitmq] **************** 2025-09-27 00:28:12.473329 | orchestrator | Saturday 27 September 2025 00:28:12 +0000 (0:00:00.748) 0:03:18.579 **** 2025-09-27 00:28:12.473340 | orchestrator | skipping: [testbed-manager] => (item={'name': 'net.ipv4.tcp_keepalive_time', 'value': 6})  2025-09-27 00:28:12.473352 | orchestrator | skipping: [testbed-manager] => (item={'name': 'net.ipv4.tcp_keepalive_intvl', 'value': 3})  2025-09-27 00:28:12.473363 | orchestrator | skipping: [testbed-manager] => (item={'name': 'net.ipv4.tcp_keepalive_probes', 'value': 3})  2025-09-27 00:28:12.473374 | orchestrator | skipping: [testbed-manager] => (item={'name': 'net.core.wmem_max', 'value': 16777216})  2025-09-27 00:28:12.473384 | orchestrator | skipping: [testbed-manager] => (item={'name': 'net.core.rmem_max', 'value': 16777216})  2025-09-27 00:28:12.473395 | orchestrator | skipping: [testbed-manager] => (item={'name': 'net.ipv4.tcp_fin_timeout', 'value': 20})  2025-09-27 00:28:12.473406 | orchestrator | skipping: [testbed-manager] => (item={'name': 'net.ipv4.tcp_tw_reuse', 'value': 1})  2025-09-27 00:28:12.473416 | orchestrator | skipping: [testbed-node-3] => (item={'name': 'net.ipv4.tcp_keepalive_time', 'value': 6})  2025-09-27 00:28:12.473427 | orchestrator | skipping: [testbed-manager] => (item={'name': 'net.core.somaxconn', 'value': 4096})  2025-09-27 00:28:12.473438 | orchestrator | skipping: [testbed-manager] => (item={'name': 'net.ipv4.tcp_syncookies', 'value': 0})  2025-09-27 00:28:12.473448 | orchestrator | skipping: [testbed-manager] => (item={'name': 'net.ipv4.tcp_max_syn_backlog', 'value': 8192})  2025-09-27 00:28:12.473459 | orchestrator | skipping: [testbed-node-3] => (item={'name': 'net.ipv4.tcp_keepalive_intvl', 'value': 3})  2025-09-27 00:28:12.473477 | orchestrator | skipping: [testbed-node-3] => (item={'name': 'net.ipv4.tcp_keepalive_probes', 'value': 3})  2025-09-27 00:28:12.473488 | orchestrator | skipping: [testbed-node-3] => (item={'name': 'net.core.wmem_max', 'value': 16777216})  2025-09-27 00:28:12.473499 | orchestrator | skipping: [testbed-node-3] => (item={'name': 'net.core.rmem_max', 'value': 16777216})  2025-09-27 00:28:12.473509 | orchestrator | skipping: [testbed-node-3] => (item={'name': 'net.ipv4.tcp_fin_timeout', 'value': 20})  2025-09-27 00:28:12.473520 | orchestrator | skipping: [testbed-node-4] => (item={'name': 'net.ipv4.tcp_keepalive_time', 'value': 6})  2025-09-27 00:28:12.473539 | orchestrator | skipping: [testbed-manager] 2025-09-27 00:28:12.473557 | orchestrator | skipping: [testbed-node-4] => (item={'name': 'net.ipv4.tcp_keepalive_intvl', 'value': 3})  2025-09-27 00:28:20.956188 | 
orchestrator | skipping: [testbed-node-3] => (item={'name': 'net.ipv4.tcp_tw_reuse', 'value': 1})  2025-09-27 00:28:20.956356 | orchestrator | skipping: [testbed-node-4] => (item={'name': 'net.ipv4.tcp_keepalive_probes', 'value': 3})  2025-09-27 00:28:20.956379 | orchestrator | skipping: [testbed-node-3] => (item={'name': 'net.core.somaxconn', 'value': 4096})  2025-09-27 00:28:20.956397 | orchestrator | skipping: [testbed-node-3] => (item={'name': 'net.ipv4.tcp_syncookies', 'value': 0})  2025-09-27 00:28:20.956416 | orchestrator | skipping: [testbed-node-4] => (item={'name': 'net.core.wmem_max', 'value': 16777216})  2025-09-27 00:28:20.956434 | orchestrator | skipping: [testbed-node-3] => (item={'name': 'net.ipv4.tcp_max_syn_backlog', 'value': 8192})  2025-09-27 00:28:20.956452 | orchestrator | skipping: [testbed-node-4] => (item={'name': 'net.core.rmem_max', 'value': 16777216})  2025-09-27 00:28:20.956469 | orchestrator | skipping: [testbed-node-4] => (item={'name': 'net.ipv4.tcp_fin_timeout', 'value': 20})  2025-09-27 00:28:20.956487 | orchestrator | skipping: [testbed-node-4] => (item={'name': 'net.ipv4.tcp_tw_reuse', 'value': 1})  2025-09-27 00:28:20.956502 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:28:20.956521 | orchestrator | skipping: [testbed-node-4] => (item={'name': 'net.core.somaxconn', 'value': 4096})  2025-09-27 00:28:20.956540 | orchestrator | skipping: [testbed-node-5] => (item={'name': 'net.ipv4.tcp_keepalive_time', 'value': 6})  2025-09-27 00:28:20.956574 | orchestrator | skipping: [testbed-node-5] => (item={'name': 'net.ipv4.tcp_keepalive_intvl', 'value': 3})  2025-09-27 00:28:20.956587 | orchestrator | skipping: [testbed-node-4] => (item={'name': 'net.ipv4.tcp_syncookies', 'value': 0})  2025-09-27 00:28:20.956604 | orchestrator | skipping: [testbed-node-5] => (item={'name': 'net.ipv4.tcp_keepalive_probes', 'value': 3})  2025-09-27 00:28:20.956620 | orchestrator | skipping: [testbed-node-4] => (item={'name': 'net.ipv4.tcp_max_syn_backlog', 'value': 8192})  2025-09-27 00:28:20.956634 | orchestrator | skipping: [testbed-node-5] => (item={'name': 'net.core.wmem_max', 'value': 16777216})  2025-09-27 00:28:20.956651 | orchestrator | skipping: [testbed-node-5] => (item={'name': 'net.core.rmem_max', 'value': 16777216})  2025-09-27 00:28:20.956662 | orchestrator | skipping: [testbed-node-5] => (item={'name': 'net.ipv4.tcp_fin_timeout', 'value': 20})  2025-09-27 00:28:20.956680 | orchestrator | skipping: [testbed-node-5] => (item={'name': 'net.ipv4.tcp_tw_reuse', 'value': 1})  2025-09-27 00:28:20.956696 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:28:20.956714 | orchestrator | skipping: [testbed-node-5] => (item={'name': 'net.core.somaxconn', 'value': 4096})  2025-09-27 00:28:20.956730 | orchestrator | skipping: [testbed-node-5] => (item={'name': 'net.ipv4.tcp_syncookies', 'value': 0})  2025-09-27 00:28:20.956746 | orchestrator | skipping: [testbed-node-5] => (item={'name': 'net.ipv4.tcp_max_syn_backlog', 'value': 8192})  2025-09-27 00:28:20.956761 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:28:20.956778 | orchestrator | changed: [testbed-node-0] => (item={'name': 'net.ipv4.tcp_keepalive_time', 'value': 6}) 2025-09-27 00:28:20.956793 | orchestrator | changed: [testbed-node-1] => (item={'name': 'net.ipv4.tcp_keepalive_time', 'value': 6}) 2025-09-27 00:28:20.956837 | orchestrator | changed: [testbed-node-2] => (item={'name': 'net.ipv4.tcp_keepalive_time', 'value': 6}) 2025-09-27 00:28:20.956853 | orchestrator | changed: [testbed-node-0] => 
(item={'name': 'net.ipv4.tcp_keepalive_intvl', 'value': 3}) 2025-09-27 00:28:20.956868 | orchestrator | changed: [testbed-node-1] => (item={'name': 'net.ipv4.tcp_keepalive_intvl', 'value': 3}) 2025-09-27 00:28:20.956886 | orchestrator | changed: [testbed-node-2] => (item={'name': 'net.ipv4.tcp_keepalive_intvl', 'value': 3}) 2025-09-27 00:28:20.956901 | orchestrator | changed: [testbed-node-0] => (item={'name': 'net.ipv4.tcp_keepalive_probes', 'value': 3}) 2025-09-27 00:28:20.956918 | orchestrator | changed: [testbed-node-1] => (item={'name': 'net.ipv4.tcp_keepalive_probes', 'value': 3}) 2025-09-27 00:28:20.956933 | orchestrator | changed: [testbed-node-0] => (item={'name': 'net.core.wmem_max', 'value': 16777216}) 2025-09-27 00:28:20.956947 | orchestrator | changed: [testbed-node-1] => (item={'name': 'net.core.wmem_max', 'value': 16777216}) 2025-09-27 00:28:20.956961 | orchestrator | changed: [testbed-node-1] => (item={'name': 'net.core.rmem_max', 'value': 16777216}) 2025-09-27 00:28:20.956976 | orchestrator | changed: [testbed-node-0] => (item={'name': 'net.core.rmem_max', 'value': 16777216}) 2025-09-27 00:28:20.956990 | orchestrator | changed: [testbed-node-2] => (item={'name': 'net.ipv4.tcp_keepalive_probes', 'value': 3}) 2025-09-27 00:28:20.957004 | orchestrator | changed: [testbed-node-1] => (item={'name': 'net.ipv4.tcp_fin_timeout', 'value': 20}) 2025-09-27 00:28:20.957019 | orchestrator | changed: [testbed-node-0] => (item={'name': 'net.ipv4.tcp_fin_timeout', 'value': 20}) 2025-09-27 00:28:20.957034 | orchestrator | changed: [testbed-node-2] => (item={'name': 'net.core.wmem_max', 'value': 16777216}) 2025-09-27 00:28:20.957048 | orchestrator | changed: [testbed-node-0] => (item={'name': 'net.ipv4.tcp_tw_reuse', 'value': 1}) 2025-09-27 00:28:20.957064 | orchestrator | changed: [testbed-node-1] => (item={'name': 'net.ipv4.tcp_tw_reuse', 'value': 1}) 2025-09-27 00:28:20.957109 | orchestrator | changed: [testbed-node-1] => (item={'name': 'net.core.somaxconn', 'value': 4096}) 2025-09-27 00:28:20.957125 | orchestrator | changed: [testbed-node-0] => (item={'name': 'net.core.somaxconn', 'value': 4096}) 2025-09-27 00:28:20.957141 | orchestrator | changed: [testbed-node-2] => (item={'name': 'net.core.rmem_max', 'value': 16777216}) 2025-09-27 00:28:20.957156 | orchestrator | changed: [testbed-node-1] => (item={'name': 'net.ipv4.tcp_syncookies', 'value': 0}) 2025-09-27 00:28:20.957170 | orchestrator | changed: [testbed-node-0] => (item={'name': 'net.ipv4.tcp_syncookies', 'value': 0}) 2025-09-27 00:28:20.957185 | orchestrator | changed: [testbed-node-1] => (item={'name': 'net.ipv4.tcp_max_syn_backlog', 'value': 8192}) 2025-09-27 00:28:20.957222 | orchestrator | changed: [testbed-node-0] => (item={'name': 'net.ipv4.tcp_max_syn_backlog', 'value': 8192}) 2025-09-27 00:28:20.957237 | orchestrator | changed: [testbed-node-2] => (item={'name': 'net.ipv4.tcp_fin_timeout', 'value': 20}) 2025-09-27 00:28:20.957250 | orchestrator | changed: [testbed-node-2] => (item={'name': 'net.ipv4.tcp_tw_reuse', 'value': 1}) 2025-09-27 00:28:20.957264 | orchestrator | changed: [testbed-node-2] => (item={'name': 'net.core.somaxconn', 'value': 4096}) 2025-09-27 00:28:20.957280 | orchestrator | changed: [testbed-node-2] => (item={'name': 'net.ipv4.tcp_syncookies', 'value': 0}) 2025-09-27 00:28:20.957297 | orchestrator | changed: [testbed-node-2] => (item={'name': 'net.ipv4.tcp_max_syn_backlog', 'value': 8192}) 2025-09-27 00:28:20.957314 | orchestrator | 2025-09-27 00:28:20.957342 | orchestrator | TASK [osism.commons.sysctl 
: Set sysctl parameters on generic] ***************** 2025-09-27 00:28:20.957360 | orchestrator | Saturday 27 September 2025 00:28:18 +0000 (0:00:05.909) 0:03:24.489 **** 2025-09-27 00:28:20.957370 | orchestrator | changed: [testbed-manager] => (item={'name': 'vm.swappiness', 'value': 1}) 2025-09-27 00:28:20.957379 | orchestrator | changed: [testbed-node-3] => (item={'name': 'vm.swappiness', 'value': 1}) 2025-09-27 00:28:20.957389 | orchestrator | changed: [testbed-node-5] => (item={'name': 'vm.swappiness', 'value': 1}) 2025-09-27 00:28:20.957407 | orchestrator | changed: [testbed-node-1] => (item={'name': 'vm.swappiness', 'value': 1}) 2025-09-27 00:28:20.957418 | orchestrator | changed: [testbed-node-0] => (item={'name': 'vm.swappiness', 'value': 1}) 2025-09-27 00:28:20.957434 | orchestrator | changed: [testbed-node-2] => (item={'name': 'vm.swappiness', 'value': 1}) 2025-09-27 00:28:20.957450 | orchestrator | changed: [testbed-node-4] => (item={'name': 'vm.swappiness', 'value': 1}) 2025-09-27 00:28:20.957467 | orchestrator | 2025-09-27 00:28:20.957484 | orchestrator | TASK [osism.commons.sysctl : Set sysctl parameters on compute] ***************** 2025-09-27 00:28:20.957499 | orchestrator | Saturday 27 September 2025 00:28:19 +0000 (0:00:01.577) 0:03:26.066 **** 2025-09-27 00:28:20.957515 | orchestrator | skipping: [testbed-manager] => (item={'name': 'net.netfilter.nf_conntrack_max', 'value': 1048576})  2025-09-27 00:28:20.957532 | orchestrator | skipping: [testbed-manager] 2025-09-27 00:28:20.957548 | orchestrator | skipping: [testbed-node-0] => (item={'name': 'net.netfilter.nf_conntrack_max', 'value': 1048576})  2025-09-27 00:28:20.957558 | orchestrator | skipping: [testbed-node-1] => (item={'name': 'net.netfilter.nf_conntrack_max', 'value': 1048576})  2025-09-27 00:28:20.957567 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:28:20.957577 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:28:20.957587 | orchestrator | skipping: [testbed-node-2] => (item={'name': 'net.netfilter.nf_conntrack_max', 'value': 1048576})  2025-09-27 00:28:20.957596 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:28:20.957609 | orchestrator | changed: [testbed-node-3] => (item={'name': 'net.netfilter.nf_conntrack_max', 'value': 1048576}) 2025-09-27 00:28:20.957625 | orchestrator | changed: [testbed-node-4] => (item={'name': 'net.netfilter.nf_conntrack_max', 'value': 1048576}) 2025-09-27 00:28:20.957641 | orchestrator | changed: [testbed-node-5] => (item={'name': 'net.netfilter.nf_conntrack_max', 'value': 1048576}) 2025-09-27 00:28:20.957657 | orchestrator | 2025-09-27 00:28:20.957671 | orchestrator | TASK [osism.commons.sysctl : Set sysctl parameters on network] ***************** 2025-09-27 00:28:20.957681 | orchestrator | Saturday 27 September 2025 00:28:20 +0000 (0:00:00.524) 0:03:26.591 **** 2025-09-27 00:28:20.957690 | orchestrator | skipping: [testbed-manager] => (item={'name': 'net.netfilter.nf_conntrack_max', 'value': 1048576})  2025-09-27 00:28:20.957700 | orchestrator | skipping: [testbed-node-3] => (item={'name': 'net.netfilter.nf_conntrack_max', 'value': 1048576})  2025-09-27 00:28:20.957709 | orchestrator | skipping: [testbed-manager] 2025-09-27 00:28:20.957718 | orchestrator | skipping: [testbed-node-4] => (item={'name': 'net.netfilter.nf_conntrack_max', 'value': 1048576})  2025-09-27 00:28:20.957728 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:28:20.957739 | orchestrator | skipping: [testbed-node-5] => (item={'name': 'net.netfilter.nf_conntrack_max', 
'value': 1048576})  2025-09-27 00:28:20.957755 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:28:20.957773 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:28:20.957783 | orchestrator | changed: [testbed-node-0] => (item={'name': 'net.netfilter.nf_conntrack_max', 'value': 1048576}) 2025-09-27 00:28:20.957792 | orchestrator | changed: [testbed-node-1] => (item={'name': 'net.netfilter.nf_conntrack_max', 'value': 1048576}) 2025-09-27 00:28:20.957802 | orchestrator | changed: [testbed-node-2] => (item={'name': 'net.netfilter.nf_conntrack_max', 'value': 1048576}) 2025-09-27 00:28:20.957811 | orchestrator | 2025-09-27 00:28:20.957829 | orchestrator | TASK [osism.commons.sysctl : Set sysctl parameters on k3s_node] **************** 2025-09-27 00:28:33.724536 | orchestrator | Saturday 27 September 2025 00:28:20 +0000 (0:00:00.602) 0:03:27.194 **** 2025-09-27 00:28:33.724685 | orchestrator | skipping: [testbed-manager] => (item={'name': 'fs.inotify.max_user_instances', 'value': 1024})  2025-09-27 00:28:33.724703 | orchestrator | skipping: [testbed-manager] 2025-09-27 00:28:33.724716 | orchestrator | skipping: [testbed-node-0] => (item={'name': 'fs.inotify.max_user_instances', 'value': 1024})  2025-09-27 00:28:33.724727 | orchestrator | skipping: [testbed-node-1] => (item={'name': 'fs.inotify.max_user_instances', 'value': 1024})  2025-09-27 00:28:33.724764 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:28:33.724776 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:28:33.724787 | orchestrator | skipping: [testbed-node-2] => (item={'name': 'fs.inotify.max_user_instances', 'value': 1024})  2025-09-27 00:28:33.724798 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:28:33.724809 | orchestrator | changed: [testbed-node-3] => (item={'name': 'fs.inotify.max_user_instances', 'value': 1024}) 2025-09-27 00:28:33.724820 | orchestrator | changed: [testbed-node-4] => (item={'name': 'fs.inotify.max_user_instances', 'value': 1024}) 2025-09-27 00:28:33.724831 | orchestrator | changed: [testbed-node-5] => (item={'name': 'fs.inotify.max_user_instances', 'value': 1024}) 2025-09-27 00:28:33.724842 | orchestrator | 2025-09-27 00:28:33.724854 | orchestrator | TASK [osism.commons.limits : Include limits tasks] ***************************** 2025-09-27 00:28:33.724886 | orchestrator | Saturday 27 September 2025 00:28:21 +0000 (0:00:00.605) 0:03:27.799 **** 2025-09-27 00:28:33.724897 | orchestrator | skipping: [testbed-manager] 2025-09-27 00:28:33.724908 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:28:33.724919 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:28:33.724930 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:28:33.724940 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:28:33.724951 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:28:33.724961 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:28:33.724972 | orchestrator | 2025-09-27 00:28:33.724983 | orchestrator | TASK [osism.commons.services : Populate service facts] ************************* 2025-09-27 00:28:33.724994 | orchestrator | Saturday 27 September 2025 00:28:21 +0000 (0:00:00.308) 0:03:28.107 **** 2025-09-27 00:28:33.725004 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:28:33.725017 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:28:33.725027 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:28:33.725038 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:28:33.725048 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:28:33.725059 | 
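The sysctl handling above loops an included sysctl.yml over per-group parameter lists (elasticsearch, rabbitmq, generic, compute, network, k3s_node), so each host only applies the lists matching its groups, which is why most hosts report "skipping" per item. A reduced sketch of that pattern using the ansible.posix.sysctl module; the task wording, target file name, and group condition are assumptions:

```yaml
# Reduced sketch of the per-group sysctl pattern visible in the log.
- name: Set sysctl parameters on {{ item.key }}
  ansible.posix.sysctl:
    name: "{{ parameter.name }}"
    value: "{{ parameter.value }}"
    sysctl_file: "/etc/sysctl.d/99-{{ item.key }}.conf"   # file name is an assumption
    state: present
    reload: true
  loop: "{{ item.value }}"
  loop_control:
    loop_var: parameter
  when: item.key == 'generic' or item.key in group_names
```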
orchestrator | ok: [testbed-node-0] 2025-09-27 00:28:33.725070 | orchestrator | ok: [testbed-manager] 2025-09-27 00:28:33.725081 | orchestrator | 2025-09-27 00:28:33.725092 | orchestrator | TASK [osism.commons.services : Check services] ********************************* 2025-09-27 00:28:33.725103 | orchestrator | Saturday 27 September 2025 00:28:27 +0000 (0:00:05.626) 0:03:33.734 **** 2025-09-27 00:28:33.725114 | orchestrator | skipping: [testbed-manager] => (item=nscd)  2025-09-27 00:28:33.725125 | orchestrator | skipping: [testbed-node-3] => (item=nscd)  2025-09-27 00:28:33.725136 | orchestrator | skipping: [testbed-manager] 2025-09-27 00:28:33.725147 | orchestrator | skipping: [testbed-node-4] => (item=nscd)  2025-09-27 00:28:33.725157 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:28:33.725168 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:28:33.725178 | orchestrator | skipping: [testbed-node-5] => (item=nscd)  2025-09-27 00:28:33.725189 | orchestrator | skipping: [testbed-node-0] => (item=nscd)  2025-09-27 00:28:33.725225 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:28:33.725236 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:28:33.725247 | orchestrator | skipping: [testbed-node-1] => (item=nscd)  2025-09-27 00:28:33.725257 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:28:33.725268 | orchestrator | skipping: [testbed-node-2] => (item=nscd)  2025-09-27 00:28:33.725278 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:28:33.725289 | orchestrator | 2025-09-27 00:28:33.725300 | orchestrator | TASK [osism.commons.services : Start/enable required services] ***************** 2025-09-27 00:28:33.725311 | orchestrator | Saturday 27 September 2025 00:28:27 +0000 (0:00:00.306) 0:03:34.040 **** 2025-09-27 00:28:33.725321 | orchestrator | ok: [testbed-manager] => (item=cron) 2025-09-27 00:28:33.725332 | orchestrator | ok: [testbed-node-4] => (item=cron) 2025-09-27 00:28:33.725343 | orchestrator | ok: [testbed-node-3] => (item=cron) 2025-09-27 00:28:33.725353 | orchestrator | ok: [testbed-node-5] => (item=cron) 2025-09-27 00:28:33.725373 | orchestrator | ok: [testbed-node-0] => (item=cron) 2025-09-27 00:28:33.725383 | orchestrator | ok: [testbed-node-1] => (item=cron) 2025-09-27 00:28:33.725394 | orchestrator | ok: [testbed-node-2] => (item=cron) 2025-09-27 00:28:33.725405 | orchestrator | 2025-09-27 00:28:33.725416 | orchestrator | TASK [osism.commons.motd : Include distribution specific configure tasks] ****** 2025-09-27 00:28:33.725427 | orchestrator | Saturday 27 September 2025 00:28:28 +0000 (0:00:01.060) 0:03:35.100 **** 2025-09-27 00:28:33.725440 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/motd/tasks/configure-Debian-family.yml for testbed-manager, testbed-node-3, testbed-node-4, testbed-node-5, testbed-node-0, testbed-node-1, testbed-node-2 2025-09-27 00:28:33.725454 | orchestrator | 2025-09-27 00:28:33.725465 | orchestrator | TASK [osism.commons.motd : Remove update-motd package] ************************* 2025-09-27 00:28:33.725476 | orchestrator | Saturday 27 September 2025 00:28:29 +0000 (0:00:00.514) 0:03:35.615 **** 2025-09-27 00:28:33.725487 | orchestrator | ok: [testbed-manager] 2025-09-27 00:28:33.725498 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:28:33.725508 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:28:33.725519 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:28:33.725530 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:28:33.725540 | 
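The services handling above first gathers service facts and then ensures a short list of required services (cron in this run) is started and enabled. A minimal sketch of that pattern; the required_services variable name is an assumption:

```yaml
# Minimal sketch of the service-facts / required-services pattern.
- name: Populate service facts
  ansible.builtin.service_facts:

- name: Start/enable required services
  ansible.builtin.service:
    name: "{{ item }}"
    state: started
    enabled: true
  loop: "{{ required_services | default(['cron']) }}"
```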
orchestrator | ok: [testbed-node-2] 2025-09-27 00:28:33.725551 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:28:33.725561 | orchestrator | 2025-09-27 00:28:33.725572 | orchestrator | TASK [osism.commons.motd : Check if /etc/default/motd-news exists] ************* 2025-09-27 00:28:33.725583 | orchestrator | Saturday 27 September 2025 00:28:30 +0000 (0:00:01.371) 0:03:36.986 **** 2025-09-27 00:28:33.725594 | orchestrator | ok: [testbed-manager] 2025-09-27 00:28:33.725624 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:28:33.725637 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:28:33.725647 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:28:33.725658 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:28:33.725668 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:28:33.725679 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:28:33.725689 | orchestrator | 2025-09-27 00:28:33.725700 | orchestrator | TASK [osism.commons.motd : Disable the dynamic motd-news service] ************** 2025-09-27 00:28:33.725711 | orchestrator | Saturday 27 September 2025 00:28:31 +0000 (0:00:00.619) 0:03:37.606 **** 2025-09-27 00:28:33.725722 | orchestrator | changed: [testbed-manager] 2025-09-27 00:28:33.725732 | orchestrator | changed: [testbed-node-3] 2025-09-27 00:28:33.725743 | orchestrator | changed: [testbed-node-5] 2025-09-27 00:28:33.725754 | orchestrator | changed: [testbed-node-4] 2025-09-27 00:28:33.725764 | orchestrator | changed: [testbed-node-0] 2025-09-27 00:28:33.725775 | orchestrator | changed: [testbed-node-1] 2025-09-27 00:28:33.725785 | orchestrator | changed: [testbed-node-2] 2025-09-27 00:28:33.725796 | orchestrator | 2025-09-27 00:28:33.725807 | orchestrator | TASK [osism.commons.motd : Get all configuration files in /etc/pam.d] ********** 2025-09-27 00:28:33.725817 | orchestrator | Saturday 27 September 2025 00:28:32 +0000 (0:00:00.653) 0:03:38.259 **** 2025-09-27 00:28:33.725828 | orchestrator | ok: [testbed-manager] 2025-09-27 00:28:33.725839 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:28:33.725849 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:28:33.725860 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:28:33.725870 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:28:33.725881 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:28:33.725891 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:28:33.725902 | orchestrator | 2025-09-27 00:28:33.725913 | orchestrator | TASK [osism.commons.motd : Remove pam_motd.so rule] **************************** 2025-09-27 00:28:33.725924 | orchestrator | Saturday 27 September 2025 00:28:32 +0000 (0:00:00.652) 0:03:38.912 **** 2025-09-27 00:28:33.725940 | orchestrator | changed: [testbed-manager] => (item={'path': '/etc/pam.d/sshd', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 2133, 'inode': 567, 'dev': 2049, 'nlink': 1, 'atime': 1758931443.1401956, 'mtime': 1740432309.0, 'ctime': 1743685035.2598536, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}) 2025-09-27 00:28:33.725962 | orchestrator | changed: [testbed-node-3] => (item={'path': '/etc/pam.d/sshd', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 2133, 'inode': 567, 'dev': 2049, 'nlink': 1, 
'atime': 1758931477.7041767, 'mtime': 1740432309.0, 'ctime': 1743685035.2598536, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}) 2025-09-27 00:28:33.725981 | orchestrator | changed: [testbed-node-5] => (item={'path': '/etc/pam.d/sshd', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 2133, 'inode': 567, 'dev': 2049, 'nlink': 1, 'atime': 1758931482.6504092, 'mtime': 1740432309.0, 'ctime': 1743685035.2598536, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}) 2025-09-27 00:28:33.725993 | orchestrator | changed: [testbed-node-0] => (item={'path': '/etc/pam.d/sshd', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 2133, 'inode': 567, 'dev': 2049, 'nlink': 1, 'atime': 1758931470.708282, 'mtime': 1740432309.0, 'ctime': 1743685035.2598536, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}) 2025-09-27 00:28:33.726005 | orchestrator | changed: [testbed-node-4] => (item={'path': '/etc/pam.d/sshd', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 2133, 'inode': 567, 'dev': 2049, 'nlink': 1, 'atime': 1758931474.9564598, 'mtime': 1740432309.0, 'ctime': 1743685035.2598536, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}) 2025-09-27 00:28:33.726112 | orchestrator | changed: [testbed-node-1] => (item={'path': '/etc/pam.d/sshd', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 2133, 'inode': 567, 'dev': 2049, 'nlink': 1, 'atime': 1758931478.8632596, 'mtime': 1740432309.0, 'ctime': 1743685035.2598536, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}) 2025-09-27 00:28:49.340610 | orchestrator | changed: [testbed-node-2] => (item={'path': '/etc/pam.d/sshd', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 2133, 'inode': 567, 'dev': 2049, 'nlink': 1, 'atime': 1758931485.3246255, 'mtime': 1740432309.0, 'ctime': 1743685035.2598536, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}) 2025-09-27 00:28:49.340792 | orchestrator | changed: [testbed-manager] => (item={'path': '/etc/pam.d/login', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 4118, 'inode': 554, 'dev': 2049, 'nlink': 1, 
'atime': 1743684808.8363404, 'mtime': 1712646062.0, 'ctime': 1743685035.2588537, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}) 2025-09-27 00:28:49.340834 | orchestrator | changed: [testbed-node-5] => (item={'path': '/etc/pam.d/login', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 4118, 'inode': 554, 'dev': 2049, 'nlink': 1, 'atime': 1743684808.8363404, 'mtime': 1712646062.0, 'ctime': 1743685035.2588537, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}) 2025-09-27 00:28:49.340848 | orchestrator | changed: [testbed-node-3] => (item={'path': '/etc/pam.d/login', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 4118, 'inode': 554, 'dev': 2049, 'nlink': 1, 'atime': 1743684808.8363404, 'mtime': 1712646062.0, 'ctime': 1743685035.2588537, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}) 2025-09-27 00:28:49.340860 | orchestrator | changed: [testbed-node-0] => (item={'path': '/etc/pam.d/login', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 4118, 'inode': 554, 'dev': 2049, 'nlink': 1, 'atime': 1743684808.8363404, 'mtime': 1712646062.0, 'ctime': 1743685035.2588537, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}) 2025-09-27 00:28:49.340872 | orchestrator | changed: [testbed-node-4] => (item={'path': '/etc/pam.d/login', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 4118, 'inode': 554, 'dev': 2049, 'nlink': 1, 'atime': 1743684808.8363404, 'mtime': 1712646062.0, 'ctime': 1743685035.2588537, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}) 2025-09-27 00:28:49.340883 | orchestrator | changed: [testbed-node-1] => (item={'path': '/etc/pam.d/login', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 4118, 'inode': 554, 'dev': 2049, 'nlink': 1, 'atime': 1743684808.8363404, 'mtime': 1712646062.0, 'ctime': 1743685035.2588537, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}) 2025-09-27 00:28:49.340934 | orchestrator | changed: [testbed-node-2] => (item={'path': '/etc/pam.d/login', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 4118, 'inode': 554, 'dev': 2049, 'nlink': 
1, 'atime': 1743684808.8363404, 'mtime': 1712646062.0, 'ctime': 1743685035.2588537, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}) 2025-09-27 00:28:49.340947 | orchestrator | 2025-09-27 00:28:49.340961 | orchestrator | TASK [osism.commons.motd : Copy motd file] ************************************* 2025-09-27 00:28:49.340974 | orchestrator | Saturday 27 September 2025 00:28:33 +0000 (0:00:01.040) 0:03:39.953 **** 2025-09-27 00:28:49.340986 | orchestrator | changed: [testbed-manager] 2025-09-27 00:28:49.341008 | orchestrator | changed: [testbed-node-3] 2025-09-27 00:28:49.341019 | orchestrator | changed: [testbed-node-0] 2025-09-27 00:28:49.341030 | orchestrator | changed: [testbed-node-5] 2025-09-27 00:28:49.341045 | orchestrator | changed: [testbed-node-4] 2025-09-27 00:28:49.341056 | orchestrator | changed: [testbed-node-1] 2025-09-27 00:28:49.341067 | orchestrator | changed: [testbed-node-2] 2025-09-27 00:28:49.341078 | orchestrator | 2025-09-27 00:28:49.341090 | orchestrator | TASK [osism.commons.motd : Copy issue file] ************************************ 2025-09-27 00:28:49.341103 | orchestrator | Saturday 27 September 2025 00:28:34 +0000 (0:00:01.073) 0:03:41.027 **** 2025-09-27 00:28:49.341116 | orchestrator | changed: [testbed-manager] 2025-09-27 00:28:49.341128 | orchestrator | changed: [testbed-node-3] 2025-09-27 00:28:49.341140 | orchestrator | changed: [testbed-node-4] 2025-09-27 00:28:49.341153 | orchestrator | changed: [testbed-node-5] 2025-09-27 00:28:49.341165 | orchestrator | changed: [testbed-node-0] 2025-09-27 00:28:49.341178 | orchestrator | changed: [testbed-node-1] 2025-09-27 00:28:49.341190 | orchestrator | changed: [testbed-node-2] 2025-09-27 00:28:49.341228 | orchestrator | 2025-09-27 00:28:49.341241 | orchestrator | TASK [osism.commons.motd : Copy issue.net file] ******************************** 2025-09-27 00:28:49.341254 | orchestrator | Saturday 27 September 2025 00:28:35 +0000 (0:00:01.162) 0:03:42.189 **** 2025-09-27 00:28:49.341266 | orchestrator | changed: [testbed-manager] 2025-09-27 00:28:49.341278 | orchestrator | changed: [testbed-node-3] 2025-09-27 00:28:49.341290 | orchestrator | changed: [testbed-node-4] 2025-09-27 00:28:49.341302 | orchestrator | changed: [testbed-node-5] 2025-09-27 00:28:49.341315 | orchestrator | changed: [testbed-node-0] 2025-09-27 00:28:49.341326 | orchestrator | changed: [testbed-node-1] 2025-09-27 00:28:49.341339 | orchestrator | changed: [testbed-node-2] 2025-09-27 00:28:49.341351 | orchestrator | 2025-09-27 00:28:49.341363 | orchestrator | TASK [osism.commons.motd : Configure SSH to print the motd] ******************** 2025-09-27 00:28:49.341375 | orchestrator | Saturday 27 September 2025 00:28:37 +0000 (0:00:01.191) 0:03:43.381 **** 2025-09-27 00:28:49.341387 | orchestrator | skipping: [testbed-manager] 2025-09-27 00:28:49.341400 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:28:49.341412 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:28:49.341426 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:28:49.341439 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:28:49.341450 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:28:49.341461 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:28:49.341471 | orchestrator | 2025-09-27 00:28:49.341482 | orchestrator | TASK [osism.commons.motd : Configure SSH to not print the motd] 
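
The osism.commons.motd tasks above replace /etc/motd, /etc/issue and /etc/issue.net and then make sure sshd itself does not print the motd (on Debian/Ubuntu pam_motd typically displays it already, so leaving PrintMotd enabled would show the banner twice). A minimal, self-contained sketch of that pattern, assuming a placeholder banner text and the Debian/Ubuntu "ssh" service name:

---
- hosts: all
  become: true
  tasks:
    - name: Deploy a static message of the day
      ansible.builtin.copy:
        dest: /etc/motd
        mode: "0644"
        content: |
          Managed by the testbed deployment -- local changes will be overwritten.

    - name: Let pam_motd print the banner instead of sshd
      ansible.builtin.lineinfile:
        path: /etc/ssh/sshd_config
        regexp: '^#?PrintMotd'
        line: PrintMotd no
        validate: /usr/sbin/sshd -t -f %s
      notify: Restart ssh

  handlers:
    - name: Restart ssh
      ansible.builtin.service:
        name: ssh
        state: restarted
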
**************** 2025-09-27 00:28:49.341493 | orchestrator | Saturday 27 September 2025 00:28:37 +0000 (0:00:00.310) 0:03:43.691 **** 2025-09-27 00:28:49.341504 | orchestrator | ok: [testbed-manager] 2025-09-27 00:28:49.341516 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:28:49.341527 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:28:49.341537 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:28:49.341548 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:28:49.341558 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:28:49.341569 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:28:49.341580 | orchestrator | 2025-09-27 00:28:49.341590 | orchestrator | TASK [osism.services.rng : Include distribution specific install tasks] ******** 2025-09-27 00:28:49.341601 | orchestrator | Saturday 27 September 2025 00:28:38 +0000 (0:00:00.758) 0:03:44.450 **** 2025-09-27 00:28:49.341614 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/rng/tasks/install-Debian-family.yml for testbed-manager, testbed-node-3, testbed-node-4, testbed-node-5, testbed-node-0, testbed-node-1, testbed-node-2 2025-09-27 00:28:49.341627 | orchestrator | 2025-09-27 00:28:49.341638 | orchestrator | TASK [osism.services.rng : Install rng package] ******************************** 2025-09-27 00:28:49.341649 | orchestrator | Saturday 27 September 2025 00:28:38 +0000 (0:00:00.449) 0:03:44.899 **** 2025-09-27 00:28:49.341660 | orchestrator | ok: [testbed-manager] 2025-09-27 00:28:49.341671 | orchestrator | changed: [testbed-node-3] 2025-09-27 00:28:49.341688 | orchestrator | changed: [testbed-node-1] 2025-09-27 00:28:49.341699 | orchestrator | changed: [testbed-node-0] 2025-09-27 00:28:49.341710 | orchestrator | changed: [testbed-node-2] 2025-09-27 00:28:49.341721 | orchestrator | changed: [testbed-node-4] 2025-09-27 00:28:49.341731 | orchestrator | changed: [testbed-node-5] 2025-09-27 00:28:49.341742 | orchestrator | 2025-09-27 00:28:49.341753 | orchestrator | TASK [osism.services.rng : Remove haveged package] ***************************** 2025-09-27 00:28:49.341764 | orchestrator | Saturday 27 September 2025 00:28:46 +0000 (0:00:08.343) 0:03:53.242 **** 2025-09-27 00:28:49.341774 | orchestrator | ok: [testbed-manager] 2025-09-27 00:28:49.341785 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:28:49.341796 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:28:49.341807 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:28:49.341817 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:28:49.341828 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:28:49.341838 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:28:49.341849 | orchestrator | 2025-09-27 00:28:49.341860 | orchestrator | TASK [osism.services.rng : Manage rng service] ********************************* 2025-09-27 00:28:49.341871 | orchestrator | Saturday 27 September 2025 00:28:48 +0000 (0:00:01.312) 0:03:54.555 **** 2025-09-27 00:28:49.341881 | orchestrator | ok: [testbed-manager] 2025-09-27 00:28:49.341892 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:28:49.341902 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:28:49.341913 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:28:49.341924 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:28:49.341934 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:28:49.341945 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:28:49.341956 | orchestrator | 2025-09-27 00:28:49.341973 | orchestrator | TASK [osism.commons.cleanup : Gather variables for each 
operating system] ****** 2025-09-27 00:29:58.215339 | orchestrator | Saturday 27 September 2025 00:28:49 +0000 (0:00:01.010) 0:03:55.565 **** 2025-09-27 00:29:58.215459 | orchestrator | ok: [testbed-manager] 2025-09-27 00:29:58.215477 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:29:58.215489 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:29:58.215500 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:29:58.215511 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:29:58.215522 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:29:58.215533 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:29:58.215544 | orchestrator | 2025-09-27 00:29:58.215557 | orchestrator | TASK [osism.commons.cleanup : Set cleanup_packages_distribution variable to default value] *** 2025-09-27 00:29:58.215569 | orchestrator | Saturday 27 September 2025 00:28:49 +0000 (0:00:00.293) 0:03:55.859 **** 2025-09-27 00:29:58.215580 | orchestrator | ok: [testbed-manager] 2025-09-27 00:29:58.215608 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:29:58.215629 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:29:58.215668 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:29:58.215693 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:29:58.215731 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:29:58.215749 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:29:58.215766 | orchestrator | 2025-09-27 00:29:58.215784 | orchestrator | TASK [osism.commons.cleanup : Set cleanup_services_distribution variable to default value] *** 2025-09-27 00:29:58.215802 | orchestrator | Saturday 27 September 2025 00:28:49 +0000 (0:00:00.378) 0:03:56.237 **** 2025-09-27 00:29:58.215820 | orchestrator | ok: [testbed-manager] 2025-09-27 00:29:58.215839 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:29:58.215861 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:29:58.215880 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:29:58.215900 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:29:58.215913 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:29:58.215925 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:29:58.215938 | orchestrator | 2025-09-27 00:29:58.215950 | orchestrator | TASK [osism.commons.cleanup : Populate service facts] ************************** 2025-09-27 00:29:58.215964 | orchestrator | Saturday 27 September 2025 00:28:50 +0000 (0:00:00.318) 0:03:56.556 **** 2025-09-27 00:29:58.215977 | orchestrator | ok: [testbed-manager] 2025-09-27 00:29:58.216013 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:29:58.216026 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:29:58.216038 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:29:58.216051 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:29:58.216063 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:29:58.216073 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:29:58.216084 | orchestrator | 2025-09-27 00:29:58.216095 | orchestrator | TASK [osism.commons.cleanup : Include distribution specific timer tasks] ******* 2025-09-27 00:29:58.216106 | orchestrator | Saturday 27 September 2025 00:28:55 +0000 (0:00:05.640) 0:04:02.196 **** 2025-09-27 00:29:58.216119 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/cleanup/tasks/timers-Debian-family.yml for testbed-manager, testbed-node-3, testbed-node-4, testbed-node-5, testbed-node-0, testbed-node-1, testbed-node-2 2025-09-27 00:29:58.216133 | orchestrator | 2025-09-27 00:29:58.216144 | orchestrator | TASK [osism.commons.cleanup : Disable apt-daily 
timers] ************************ 2025-09-27 00:29:58.216155 | orchestrator | Saturday 27 September 2025 00:28:56 +0000 (0:00:00.316) 0:04:02.513 **** 2025-09-27 00:29:58.216166 | orchestrator | skipping: [testbed-manager] => (item=apt-daily-upgrade)  2025-09-27 00:29:58.216177 | orchestrator | skipping: [testbed-manager] => (item=apt-daily)  2025-09-27 00:29:58.216189 | orchestrator | skipping: [testbed-node-3] => (item=apt-daily-upgrade)  2025-09-27 00:29:58.216222 | orchestrator | skipping: [testbed-node-3] => (item=apt-daily)  2025-09-27 00:29:58.216234 | orchestrator | skipping: [testbed-manager] 2025-09-27 00:29:58.216245 | orchestrator | skipping: [testbed-node-4] => (item=apt-daily-upgrade)  2025-09-27 00:29:58.216256 | orchestrator | skipping: [testbed-node-4] => (item=apt-daily)  2025-09-27 00:29:58.216267 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:29:58.216277 | orchestrator | skipping: [testbed-node-5] => (item=apt-daily-upgrade)  2025-09-27 00:29:58.216288 | orchestrator | skipping: [testbed-node-5] => (item=apt-daily)  2025-09-27 00:29:58.216299 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:29:58.216310 | orchestrator | skipping: [testbed-node-0] => (item=apt-daily-upgrade)  2025-09-27 00:29:58.216321 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:29:58.216331 | orchestrator | skipping: [testbed-node-0] => (item=apt-daily)  2025-09-27 00:29:58.216342 | orchestrator | skipping: [testbed-node-1] => (item=apt-daily-upgrade)  2025-09-27 00:29:58.216353 | orchestrator | skipping: [testbed-node-1] => (item=apt-daily)  2025-09-27 00:29:58.216363 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:29:58.216374 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:29:58.216385 | orchestrator | skipping: [testbed-node-2] => (item=apt-daily-upgrade)  2025-09-27 00:29:58.216396 | orchestrator | skipping: [testbed-node-2] => (item=apt-daily)  2025-09-27 00:29:58.216406 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:29:58.216417 | orchestrator | 2025-09-27 00:29:58.216428 | orchestrator | TASK [osism.commons.cleanup : Include service tasks] *************************** 2025-09-27 00:29:58.216439 | orchestrator | Saturday 27 September 2025 00:28:56 +0000 (0:00:00.316) 0:04:02.830 **** 2025-09-27 00:29:58.216450 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/cleanup/tasks/services-Debian-family.yml for testbed-manager, testbed-node-3, testbed-node-4, testbed-node-5, testbed-node-0, testbed-node-1, testbed-node-2 2025-09-27 00:29:58.216462 | orchestrator | 2025-09-27 00:29:58.216473 | orchestrator | TASK [osism.commons.cleanup : Cleanup services] ******************************** 2025-09-27 00:29:58.216484 | orchestrator | Saturday 27 September 2025 00:28:56 +0000 (0:00:00.325) 0:04:03.156 **** 2025-09-27 00:29:58.216494 | orchestrator | skipping: [testbed-manager] => (item=ModemManager.service)  2025-09-27 00:29:58.216505 | orchestrator | skipping: [testbed-manager] 2025-09-27 00:29:58.216516 | orchestrator | skipping: [testbed-node-3] => (item=ModemManager.service)  2025-09-27 00:29:58.216527 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:29:58.216538 | orchestrator | skipping: [testbed-node-4] => (item=ModemManager.service)  2025-09-27 00:29:58.216576 | orchestrator | skipping: [testbed-node-5] => (item=ModemManager.service)  2025-09-27 00:29:58.216588 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:29:58.216599 | orchestrator | skipping: [testbed-node-0] => 
(item=ModemManager.service)  2025-09-27 00:29:58.216610 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:29:58.216621 | orchestrator | skipping: [testbed-node-1] => (item=ModemManager.service)  2025-09-27 00:29:58.216632 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:29:58.216643 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:29:58.216654 | orchestrator | skipping: [testbed-node-2] => (item=ModemManager.service)  2025-09-27 00:29:58.216665 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:29:58.216676 | orchestrator | 2025-09-27 00:29:58.216687 | orchestrator | TASK [osism.commons.cleanup : Include packages tasks] ************************** 2025-09-27 00:29:58.216698 | orchestrator | Saturday 27 September 2025 00:28:57 +0000 (0:00:00.271) 0:04:03.427 **** 2025-09-27 00:29:58.216710 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/cleanup/tasks/packages-Debian-family.yml for testbed-manager, testbed-node-3, testbed-node-4, testbed-node-5, testbed-node-0, testbed-node-1, testbed-node-2 2025-09-27 00:29:58.216721 | orchestrator | 2025-09-27 00:29:58.216732 | orchestrator | TASK [osism.commons.cleanup : Cleanup installed packages] ********************** 2025-09-27 00:29:58.216743 | orchestrator | Saturday 27 September 2025 00:28:57 +0000 (0:00:00.318) 0:04:03.746 **** 2025-09-27 00:29:58.216754 | orchestrator | changed: [testbed-manager] 2025-09-27 00:29:58.216765 | orchestrator | changed: [testbed-node-3] 2025-09-27 00:29:58.216776 | orchestrator | changed: [testbed-node-0] 2025-09-27 00:29:58.216787 | orchestrator | changed: [testbed-node-1] 2025-09-27 00:29:58.216798 | orchestrator | changed: [testbed-node-5] 2025-09-27 00:29:58.216809 | orchestrator | changed: [testbed-node-2] 2025-09-27 00:29:58.216819 | orchestrator | changed: [testbed-node-4] 2025-09-27 00:29:58.216830 | orchestrator | 2025-09-27 00:29:58.216841 | orchestrator | TASK [osism.commons.cleanup : Remove cloudinit package] ************************ 2025-09-27 00:29:58.216852 | orchestrator | Saturday 27 September 2025 00:29:31 +0000 (0:00:34.298) 0:04:38.044 **** 2025-09-27 00:29:58.216863 | orchestrator | changed: [testbed-manager] 2025-09-27 00:29:58.216874 | orchestrator | changed: [testbed-node-3] 2025-09-27 00:29:58.216885 | orchestrator | changed: [testbed-node-5] 2025-09-27 00:29:58.216896 | orchestrator | changed: [testbed-node-4] 2025-09-27 00:29:58.216907 | orchestrator | changed: [testbed-node-2] 2025-09-27 00:29:58.216918 | orchestrator | changed: [testbed-node-0] 2025-09-27 00:29:58.216929 | orchestrator | changed: [testbed-node-1] 2025-09-27 00:29:58.216940 | orchestrator | 2025-09-27 00:29:58.216951 | orchestrator | TASK [osism.commons.cleanup : Uninstall unattended-upgrades package] *********** 2025-09-27 00:29:58.216962 | orchestrator | Saturday 27 September 2025 00:29:39 +0000 (0:00:07.970) 0:04:46.015 **** 2025-09-27 00:29:58.216973 | orchestrator | changed: [testbed-manager] 2025-09-27 00:29:58.216984 | orchestrator | changed: [testbed-node-3] 2025-09-27 00:29:58.216995 | orchestrator | changed: [testbed-node-5] 2025-09-27 00:29:58.217005 | orchestrator | changed: [testbed-node-0] 2025-09-27 00:29:58.217016 | orchestrator | changed: [testbed-node-2] 2025-09-27 00:29:58.217027 | orchestrator | changed: [testbed-node-1] 2025-09-27 00:29:58.217037 | orchestrator | changed: [testbed-node-4] 2025-09-27 00:29:58.217048 | orchestrator | 2025-09-27 00:29:58.217059 | orchestrator | TASK [osism.commons.cleanup : Remove useless 
packages from the cache] ********** 2025-09-27 00:29:58.217070 | orchestrator | Saturday 27 September 2025 00:29:47 +0000 (0:00:07.877) 0:04:53.892 **** 2025-09-27 00:29:58.217082 | orchestrator | ok: [testbed-manager] 2025-09-27 00:29:58.217093 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:29:58.217104 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:29:58.217114 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:29:58.217125 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:29:58.217136 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:29:58.217147 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:29:58.217164 | orchestrator | 2025-09-27 00:29:58.217175 | orchestrator | TASK [osism.commons.cleanup : Remove dependencies that are no longer required] *** 2025-09-27 00:29:58.217196 | orchestrator | Saturday 27 September 2025 00:29:49 +0000 (0:00:01.739) 0:04:55.631 **** 2025-09-27 00:29:58.217224 | orchestrator | changed: [testbed-manager] 2025-09-27 00:29:58.217236 | orchestrator | changed: [testbed-node-3] 2025-09-27 00:29:58.217246 | orchestrator | changed: [testbed-node-5] 2025-09-27 00:29:58.217257 | orchestrator | changed: [testbed-node-2] 2025-09-27 00:29:58.217268 | orchestrator | changed: [testbed-node-1] 2025-09-27 00:29:58.217279 | orchestrator | changed: [testbed-node-0] 2025-09-27 00:29:58.217289 | orchestrator | changed: [testbed-node-4] 2025-09-27 00:29:58.217300 | orchestrator | 2025-09-27 00:29:58.217311 | orchestrator | TASK [osism.commons.cleanup : Include cloudinit tasks] ************************* 2025-09-27 00:29:58.217322 | orchestrator | Saturday 27 September 2025 00:29:55 +0000 (0:00:05.833) 0:05:01.465 **** 2025-09-27 00:29:58.217334 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/cleanup/tasks/cloudinit.yml for testbed-manager, testbed-node-3, testbed-node-4, testbed-node-5, testbed-node-0, testbed-node-1, testbed-node-2 2025-09-27 00:29:58.217346 | orchestrator | 2025-09-27 00:29:58.217357 | orchestrator | TASK [osism.commons.cleanup : Remove cloud-init configuration directory] ******* 2025-09-27 00:29:58.217368 | orchestrator | Saturday 27 September 2025 00:29:55 +0000 (0:00:00.510) 0:05:01.975 **** 2025-09-27 00:29:58.217379 | orchestrator | changed: [testbed-manager] 2025-09-27 00:29:58.217390 | orchestrator | changed: [testbed-node-3] 2025-09-27 00:29:58.217400 | orchestrator | changed: [testbed-node-5] 2025-09-27 00:29:58.217411 | orchestrator | changed: [testbed-node-4] 2025-09-27 00:29:58.217422 | orchestrator | changed: [testbed-node-0] 2025-09-27 00:29:58.217433 | orchestrator | changed: [testbed-node-1] 2025-09-27 00:29:58.217443 | orchestrator | changed: [testbed-node-2] 2025-09-27 00:29:58.217454 | orchestrator | 2025-09-27 00:29:58.217465 | orchestrator | TASK [osism.commons.timezone : Install tzdata package] ************************* 2025-09-27 00:29:58.217476 | orchestrator | Saturday 27 September 2025 00:29:56 +0000 (0:00:00.767) 0:05:02.743 **** 2025-09-27 00:29:58.217487 | orchestrator | ok: [testbed-manager] 2025-09-27 00:29:58.217497 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:29:58.217508 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:29:58.217519 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:29:58.217537 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:30:14.418484 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:30:14.418597 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:30:14.418614 | orchestrator | 2025-09-27 00:30:14.418627 | orchestrator | TASK 
[osism.commons.timezone : Set timezone to UTC] **************************** 2025-09-27 00:30:14.418641 | orchestrator | Saturday 27 September 2025 00:29:58 +0000 (0:00:01.701) 0:05:04.444 **** 2025-09-27 00:30:14.418652 | orchestrator | changed: [testbed-node-0] 2025-09-27 00:30:14.418665 | orchestrator | changed: [testbed-manager] 2025-09-27 00:30:14.418676 | orchestrator | changed: [testbed-node-4] 2025-09-27 00:30:14.418687 | orchestrator | changed: [testbed-node-3] 2025-09-27 00:30:14.418697 | orchestrator | changed: [testbed-node-5] 2025-09-27 00:30:14.418709 | orchestrator | changed: [testbed-node-1] 2025-09-27 00:30:14.418720 | orchestrator | changed: [testbed-node-2] 2025-09-27 00:30:14.418749 | orchestrator | 2025-09-27 00:30:14.418760 | orchestrator | TASK [osism.commons.timezone : Create /etc/adjtime file] *********************** 2025-09-27 00:30:14.418783 | orchestrator | Saturday 27 September 2025 00:29:58 +0000 (0:00:00.779) 0:05:05.224 **** 2025-09-27 00:30:14.418808 | orchestrator | skipping: [testbed-manager] 2025-09-27 00:30:14.418820 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:30:14.418831 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:30:14.418842 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:30:14.418852 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:30:14.418863 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:30:14.418874 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:30:14.418885 | orchestrator | 2025-09-27 00:30:14.418896 | orchestrator | TASK [osism.commons.timezone : Ensure UTC in /etc/adjtime] ********************* 2025-09-27 00:30:14.418930 | orchestrator | Saturday 27 September 2025 00:29:59 +0000 (0:00:00.258) 0:05:05.483 **** 2025-09-27 00:30:14.418942 | orchestrator | skipping: [testbed-manager] 2025-09-27 00:30:14.418953 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:30:14.418964 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:30:14.418974 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:30:14.418985 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:30:14.418997 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:30:14.419007 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:30:14.419020 | orchestrator | 2025-09-27 00:30:14.419033 | orchestrator | TASK [osism.services.docker : Gather variables for each operating system] ****** 2025-09-27 00:30:14.419045 | orchestrator | Saturday 27 September 2025 00:29:59 +0000 (0:00:00.349) 0:05:05.832 **** 2025-09-27 00:30:14.419058 | orchestrator | ok: [testbed-manager] 2025-09-27 00:30:14.419070 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:30:14.419083 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:30:14.419095 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:30:14.419107 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:30:14.419119 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:30:14.419132 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:30:14.419144 | orchestrator | 2025-09-27 00:30:14.419157 | orchestrator | TASK [osism.services.docker : Set docker_version variable to default value] **** 2025-09-27 00:30:14.419170 | orchestrator | Saturday 27 September 2025 00:29:59 +0000 (0:00:00.304) 0:05:06.137 **** 2025-09-27 00:30:14.419182 | orchestrator | skipping: [testbed-manager] 2025-09-27 00:30:14.419194 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:30:14.419228 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:30:14.419241 | orchestrator | skipping: 
[testbed-node-5] 2025-09-27 00:30:14.419254 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:30:14.419266 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:30:14.419278 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:30:14.419290 | orchestrator | 2025-09-27 00:30:14.419303 | orchestrator | TASK [osism.services.docker : Set docker_cli_version variable to default value] *** 2025-09-27 00:30:14.419316 | orchestrator | Saturday 27 September 2025 00:30:00 +0000 (0:00:00.254) 0:05:06.392 **** 2025-09-27 00:30:14.419329 | orchestrator | ok: [testbed-manager] 2025-09-27 00:30:14.419341 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:30:14.419353 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:30:14.419365 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:30:14.419378 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:30:14.419389 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:30:14.419400 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:30:14.419410 | orchestrator | 2025-09-27 00:30:14.419421 | orchestrator | TASK [osism.services.docker : Print used docker version] *********************** 2025-09-27 00:30:14.419432 | orchestrator | Saturday 27 September 2025 00:30:00 +0000 (0:00:00.290) 0:05:06.682 **** 2025-09-27 00:30:14.419443 | orchestrator | ok: [testbed-manager] =>  2025-09-27 00:30:14.419454 | orchestrator |  docker_version: 5:27.5.1 2025-09-27 00:30:14.419465 | orchestrator | ok: [testbed-node-3] =>  2025-09-27 00:30:14.419475 | orchestrator |  docker_version: 5:27.5.1 2025-09-27 00:30:14.419486 | orchestrator | ok: [testbed-node-4] =>  2025-09-27 00:30:14.419497 | orchestrator |  docker_version: 5:27.5.1 2025-09-27 00:30:14.419508 | orchestrator | ok: [testbed-node-5] =>  2025-09-27 00:30:14.419518 | orchestrator |  docker_version: 5:27.5.1 2025-09-27 00:30:14.419529 | orchestrator | ok: [testbed-node-0] =>  2025-09-27 00:30:14.419540 | orchestrator |  docker_version: 5:27.5.1 2025-09-27 00:30:14.419550 | orchestrator | ok: [testbed-node-1] =>  2025-09-27 00:30:14.419561 | orchestrator |  docker_version: 5:27.5.1 2025-09-27 00:30:14.419572 | orchestrator | ok: [testbed-node-2] =>  2025-09-27 00:30:14.419582 | orchestrator |  docker_version: 5:27.5.1 2025-09-27 00:30:14.419593 | orchestrator | 2025-09-27 00:30:14.419604 | orchestrator | TASK [osism.services.docker : Print used docker cli version] ******************* 2025-09-27 00:30:14.419615 | orchestrator | Saturday 27 September 2025 00:30:00 +0000 (0:00:00.276) 0:05:06.958 **** 2025-09-27 00:30:14.419634 | orchestrator | ok: [testbed-manager] =>  2025-09-27 00:30:14.419644 | orchestrator |  docker_cli_version: 5:27.5.1 2025-09-27 00:30:14.419655 | orchestrator | ok: [testbed-node-3] =>  2025-09-27 00:30:14.419666 | orchestrator |  docker_cli_version: 5:27.5.1 2025-09-27 00:30:14.419677 | orchestrator | ok: [testbed-node-4] =>  2025-09-27 00:30:14.419687 | orchestrator |  docker_cli_version: 5:27.5.1 2025-09-27 00:30:14.419698 | orchestrator | ok: [testbed-node-5] =>  2025-09-27 00:30:14.419709 | orchestrator |  docker_cli_version: 5:27.5.1 2025-09-27 00:30:14.419719 | orchestrator | ok: [testbed-node-0] =>  2025-09-27 00:30:14.419730 | orchestrator |  docker_cli_version: 5:27.5.1 2025-09-27 00:30:14.419741 | orchestrator | ok: [testbed-node-1] =>  2025-09-27 00:30:14.419751 | orchestrator |  docker_cli_version: 5:27.5.1 2025-09-27 00:30:14.419762 | orchestrator | ok: [testbed-node-2] =>  2025-09-27 00:30:14.419773 | orchestrator |  docker_cli_version: 5:27.5.1 2025-09-27 00:30:14.419783 | 
orchestrator | 2025-09-27 00:30:14.419794 | orchestrator | TASK [osism.services.docker : Include block storage tasks] ********************* 2025-09-27 00:30:14.419823 | orchestrator | Saturday 27 September 2025 00:30:01 +0000 (0:00:00.294) 0:05:07.253 **** 2025-09-27 00:30:14.419835 | orchestrator | skipping: [testbed-manager] 2025-09-27 00:30:14.419846 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:30:14.419857 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:30:14.419867 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:30:14.419878 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:30:14.419889 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:30:14.419899 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:30:14.419910 | orchestrator | 2025-09-27 00:30:14.419921 | orchestrator | TASK [osism.services.docker : Include zram storage tasks] ********************** 2025-09-27 00:30:14.419932 | orchestrator | Saturday 27 September 2025 00:30:01 +0000 (0:00:00.252) 0:05:07.505 **** 2025-09-27 00:30:14.419943 | orchestrator | skipping: [testbed-manager] 2025-09-27 00:30:14.419954 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:30:14.419964 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:30:14.419975 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:30:14.419986 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:30:14.420002 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:30:14.420013 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:30:14.420024 | orchestrator | 2025-09-27 00:30:14.420035 | orchestrator | TASK [osism.services.docker : Include docker install tasks] ******************** 2025-09-27 00:30:14.420046 | orchestrator | Saturday 27 September 2025 00:30:01 +0000 (0:00:00.255) 0:05:07.760 **** 2025-09-27 00:30:14.420058 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/docker/tasks/install-docker-Debian-family.yml for testbed-manager, testbed-node-3, testbed-node-4, testbed-node-5, testbed-node-0, testbed-node-1, testbed-node-2 2025-09-27 00:30:14.420071 | orchestrator | 2025-09-27 00:30:14.420083 | orchestrator | TASK [osism.services.docker : Remove old architecture-dependent repository] **** 2025-09-27 00:30:14.420094 | orchestrator | Saturday 27 September 2025 00:30:01 +0000 (0:00:00.394) 0:05:08.155 **** 2025-09-27 00:30:14.420104 | orchestrator | ok: [testbed-manager] 2025-09-27 00:30:14.420115 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:30:14.420126 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:30:14.420137 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:30:14.420148 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:30:14.420158 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:30:14.420169 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:30:14.420180 | orchestrator | 2025-09-27 00:30:14.420191 | orchestrator | TASK [osism.services.docker : Gather package facts] **************************** 2025-09-27 00:30:14.420236 | orchestrator | Saturday 27 September 2025 00:30:02 +0000 (0:00:01.003) 0:05:09.158 **** 2025-09-27 00:30:14.420248 | orchestrator | ok: [testbed-manager] 2025-09-27 00:30:14.420259 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:30:14.420270 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:30:14.420287 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:30:14.420298 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:30:14.420308 | orchestrator | ok: [testbed-node-3] 2025-09-27 
00:30:14.420319 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:30:14.420329 | orchestrator | 2025-09-27 00:30:14.420341 | orchestrator | TASK [osism.services.docker : Check whether packages are installed that should not be installed] *** 2025-09-27 00:30:14.420353 | orchestrator | Saturday 27 September 2025 00:30:06 +0000 (0:00:03.905) 0:05:13.064 **** 2025-09-27 00:30:14.420364 | orchestrator | skipping: [testbed-manager] => (item=containerd)  2025-09-27 00:30:14.420375 | orchestrator | skipping: [testbed-manager] => (item=docker.io)  2025-09-27 00:30:14.420386 | orchestrator | skipping: [testbed-manager] => (item=docker-engine)  2025-09-27 00:30:14.420396 | orchestrator | skipping: [testbed-manager] 2025-09-27 00:30:14.420407 | orchestrator | skipping: [testbed-node-3] => (item=containerd)  2025-09-27 00:30:14.420418 | orchestrator | skipping: [testbed-node-3] => (item=docker.io)  2025-09-27 00:30:14.420428 | orchestrator | skipping: [testbed-node-3] => (item=docker-engine)  2025-09-27 00:30:14.420439 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:30:14.420450 | orchestrator | skipping: [testbed-node-4] => (item=containerd)  2025-09-27 00:30:14.420460 | orchestrator | skipping: [testbed-node-4] => (item=docker.io)  2025-09-27 00:30:14.420471 | orchestrator | skipping: [testbed-node-4] => (item=docker-engine)  2025-09-27 00:30:14.420481 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:30:14.420492 | orchestrator | skipping: [testbed-node-5] => (item=containerd)  2025-09-27 00:30:14.420503 | orchestrator | skipping: [testbed-node-5] => (item=docker.io)  2025-09-27 00:30:14.420514 | orchestrator | skipping: [testbed-node-5] => (item=docker-engine)  2025-09-27 00:30:14.420524 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:30:14.420535 | orchestrator | skipping: [testbed-node-0] => (item=containerd)  2025-09-27 00:30:14.420545 | orchestrator | skipping: [testbed-node-0] => (item=docker.io)  2025-09-27 00:30:14.420556 | orchestrator | skipping: [testbed-node-0] => (item=docker-engine)  2025-09-27 00:30:14.420567 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:30:14.420577 | orchestrator | skipping: [testbed-node-1] => (item=containerd)  2025-09-27 00:30:14.420588 | orchestrator | skipping: [testbed-node-1] => (item=docker.io)  2025-09-27 00:30:14.420598 | orchestrator | skipping: [testbed-node-1] => (item=docker-engine)  2025-09-27 00:30:14.420609 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:30:14.420619 | orchestrator | skipping: [testbed-node-2] => (item=containerd)  2025-09-27 00:30:14.420630 | orchestrator | skipping: [testbed-node-2] => (item=docker.io)  2025-09-27 00:30:14.420641 | orchestrator | skipping: [testbed-node-2] => (item=docker-engine)  2025-09-27 00:30:14.420651 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:30:14.420662 | orchestrator | 2025-09-27 00:30:14.420673 | orchestrator | TASK [osism.services.docker : Install apt-transport-https package] ************* 2025-09-27 00:30:14.420684 | orchestrator | Saturday 27 September 2025 00:30:07 +0000 (0:00:00.609) 0:05:13.674 **** 2025-09-27 00:30:14.420694 | orchestrator | ok: [testbed-manager] 2025-09-27 00:30:14.420705 | orchestrator | changed: [testbed-node-3] 2025-09-27 00:30:14.420716 | orchestrator | changed: [testbed-node-0] 2025-09-27 00:30:14.420726 | orchestrator | changed: [testbed-node-5] 2025-09-27 00:30:14.420737 | orchestrator | changed: [testbed-node-1] 2025-09-27 00:30:14.420748 | orchestrator | changed: [testbed-node-2] 2025-09-27 
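
The "Check whether packages are installed that should not be installed" step above acts as a guard against distribution container runtimes (containerd, docker.io, docker-engine) that would conflict with the docker-ce packages installed later; in this run every item reports skipping. What such a guard can look like, sketched with the same package list (the play header and the failure message are illustrative, not taken from the role):

---
- hosts: all
  become: true
  tasks:
    - name: Gather package facts
      ansible.builtin.package_facts:
        manager: auto

    - name: Abort when a conflicting container runtime is already installed
      ansible.builtin.fail:
        msg: "{{ item }} is installed and conflicts with docker-ce, remove it first"
      when: item in ansible_facts.packages
      loop:
        - containerd
        - docker.io
        - docker-engine
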
00:30:14.420758 | orchestrator | changed: [testbed-node-4] 2025-09-27 00:30:14.420769 | orchestrator | 2025-09-27 00:30:14.420787 | orchestrator | TASK [osism.services.docker : Add repository gpg key] ************************** 2025-09-27 00:31:09.697694 | orchestrator | Saturday 27 September 2025 00:30:14 +0000 (0:00:06.953) 0:05:20.627 **** 2025-09-27 00:31:09.697811 | orchestrator | ok: [testbed-manager] 2025-09-27 00:31:09.697828 | orchestrator | changed: [testbed-node-3] 2025-09-27 00:31:09.697840 | orchestrator | changed: [testbed-node-4] 2025-09-27 00:31:09.697852 | orchestrator | changed: [testbed-node-0] 2025-09-27 00:31:09.697890 | orchestrator | changed: [testbed-node-5] 2025-09-27 00:31:09.697902 | orchestrator | changed: [testbed-node-1] 2025-09-27 00:31:09.697912 | orchestrator | changed: [testbed-node-2] 2025-09-27 00:31:09.697923 | orchestrator | 2025-09-27 00:31:09.697935 | orchestrator | TASK [osism.services.docker : Add repository] ********************************** 2025-09-27 00:31:09.697946 | orchestrator | Saturday 27 September 2025 00:30:15 +0000 (0:00:01.390) 0:05:22.017 **** 2025-09-27 00:31:09.697957 | orchestrator | ok: [testbed-manager] 2025-09-27 00:31:09.697967 | orchestrator | changed: [testbed-node-0] 2025-09-27 00:31:09.697979 | orchestrator | changed: [testbed-node-5] 2025-09-27 00:31:09.698004 | orchestrator | changed: [testbed-node-3] 2025-09-27 00:31:09.698076 | orchestrator | changed: [testbed-node-2] 2025-09-27 00:31:09.698089 | orchestrator | changed: [testbed-node-1] 2025-09-27 00:31:09.698100 | orchestrator | changed: [testbed-node-4] 2025-09-27 00:31:09.698110 | orchestrator | 2025-09-27 00:31:09.698121 | orchestrator | TASK [osism.services.docker : Update package cache] **************************** 2025-09-27 00:31:09.698132 | orchestrator | Saturday 27 September 2025 00:30:24 +0000 (0:00:08.868) 0:05:30.885 **** 2025-09-27 00:31:09.698143 | orchestrator | changed: [testbed-manager] 2025-09-27 00:31:09.698154 | orchestrator | changed: [testbed-node-3] 2025-09-27 00:31:09.698165 | orchestrator | changed: [testbed-node-4] 2025-09-27 00:31:09.698176 | orchestrator | changed: [testbed-node-5] 2025-09-27 00:31:09.698186 | orchestrator | changed: [testbed-node-0] 2025-09-27 00:31:09.698197 | orchestrator | changed: [testbed-node-1] 2025-09-27 00:31:09.698239 | orchestrator | changed: [testbed-node-2] 2025-09-27 00:31:09.698252 | orchestrator | 2025-09-27 00:31:09.698265 | orchestrator | TASK [osism.services.docker : Pin docker package version] ********************** 2025-09-27 00:31:09.698278 | orchestrator | Saturday 27 September 2025 00:30:27 +0000 (0:00:03.278) 0:05:34.164 **** 2025-09-27 00:31:09.698290 | orchestrator | ok: [testbed-manager] 2025-09-27 00:31:09.698303 | orchestrator | changed: [testbed-node-3] 2025-09-27 00:31:09.698316 | orchestrator | changed: [testbed-node-4] 2025-09-27 00:31:09.698329 | orchestrator | changed: [testbed-node-5] 2025-09-27 00:31:09.698341 | orchestrator | changed: [testbed-node-0] 2025-09-27 00:31:09.698355 | orchestrator | changed: [testbed-node-1] 2025-09-27 00:31:09.698367 | orchestrator | changed: [testbed-node-2] 2025-09-27 00:31:09.698380 | orchestrator | 2025-09-27 00:31:09.698392 | orchestrator | TASK [osism.services.docker : Pin docker-cli package version] ****************** 2025-09-27 00:31:09.698404 | orchestrator | Saturday 27 September 2025 00:30:29 +0000 (0:00:01.316) 0:05:35.481 **** 2025-09-27 00:31:09.698417 | orchestrator | ok: [testbed-manager] 2025-09-27 00:31:09.698430 | orchestrator | 
changed: [testbed-node-3] 2025-09-27 00:31:09.698442 | orchestrator | changed: [testbed-node-4] 2025-09-27 00:31:09.698454 | orchestrator | changed: [testbed-node-5] 2025-09-27 00:31:09.698467 | orchestrator | changed: [testbed-node-0] 2025-09-27 00:31:09.698479 | orchestrator | changed: [testbed-node-1] 2025-09-27 00:31:09.698492 | orchestrator | changed: [testbed-node-2] 2025-09-27 00:31:09.698504 | orchestrator | 2025-09-27 00:31:09.698517 | orchestrator | TASK [osism.services.docker : Unlock containerd package] *********************** 2025-09-27 00:31:09.698529 | orchestrator | Saturday 27 September 2025 00:30:30 +0000 (0:00:01.373) 0:05:36.854 **** 2025-09-27 00:31:09.698541 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:31:09.698554 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:31:09.698567 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:31:09.698580 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:31:09.698591 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:31:09.698602 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:31:09.698613 | orchestrator | changed: [testbed-manager] 2025-09-27 00:31:09.698623 | orchestrator | 2025-09-27 00:31:09.698634 | orchestrator | TASK [osism.services.docker : Install containerd package] ********************** 2025-09-27 00:31:09.698645 | orchestrator | Saturday 27 September 2025 00:30:31 +0000 (0:00:00.857) 0:05:37.712 **** 2025-09-27 00:31:09.698656 | orchestrator | ok: [testbed-manager] 2025-09-27 00:31:09.698667 | orchestrator | changed: [testbed-node-3] 2025-09-27 00:31:09.698687 | orchestrator | changed: [testbed-node-0] 2025-09-27 00:31:09.698697 | orchestrator | changed: [testbed-node-2] 2025-09-27 00:31:09.698708 | orchestrator | changed: [testbed-node-5] 2025-09-27 00:31:09.698719 | orchestrator | changed: [testbed-node-1] 2025-09-27 00:31:09.698730 | orchestrator | changed: [testbed-node-4] 2025-09-27 00:31:09.698741 | orchestrator | 2025-09-27 00:31:09.698752 | orchestrator | TASK [osism.services.docker : Lock containerd package] ************************* 2025-09-27 00:31:09.698763 | orchestrator | Saturday 27 September 2025 00:30:41 +0000 (0:00:09.882) 0:05:47.595 **** 2025-09-27 00:31:09.698773 | orchestrator | changed: [testbed-manager] 2025-09-27 00:31:09.698784 | orchestrator | changed: [testbed-node-3] 2025-09-27 00:31:09.698795 | orchestrator | changed: [testbed-node-4] 2025-09-27 00:31:09.698806 | orchestrator | changed: [testbed-node-5] 2025-09-27 00:31:09.698816 | orchestrator | changed: [testbed-node-0] 2025-09-27 00:31:09.698827 | orchestrator | changed: [testbed-node-1] 2025-09-27 00:31:09.698838 | orchestrator | changed: [testbed-node-2] 2025-09-27 00:31:09.698848 | orchestrator | 2025-09-27 00:31:09.698859 | orchestrator | TASK [osism.services.docker : Install docker-cli package] ********************** 2025-09-27 00:31:09.698870 | orchestrator | Saturday 27 September 2025 00:30:42 +0000 (0:00:01.001) 0:05:48.596 **** 2025-09-27 00:31:09.698881 | orchestrator | ok: [testbed-manager] 2025-09-27 00:31:09.698892 | orchestrator | changed: [testbed-node-3] 2025-09-27 00:31:09.698903 | orchestrator | changed: [testbed-node-5] 2025-09-27 00:31:09.698914 | orchestrator | changed: [testbed-node-4] 2025-09-27 00:31:09.698924 | orchestrator | changed: [testbed-node-0] 2025-09-27 00:31:09.698935 | orchestrator | changed: [testbed-node-1] 2025-09-27 00:31:09.698946 | orchestrator | changed: [testbed-node-2] 2025-09-27 00:31:09.698956 | orchestrator | 2025-09-27 00:31:09.698967 | 
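
The repository and pinning steps follow the usual Docker-on-Ubuntu pattern: put the upstream GPG key into a keyring, add a signed-by repository entry, pin docker-ce and docker-ce-cli to the version printed earlier in the log (5:27.5.1), and hold containerd so a routine apt upgrade cannot move it. A condensed sketch of those steps; the download.docker.com URLs, file names and the containerd.io package name are assumptions, not copied from the role:

---
- hosts: all
  become: true
  vars:
    docker_version: "5:27.5.1"        # version reported by the role above
  tasks:
    - name: Ensure the apt keyring directory exists
      ansible.builtin.file:
        path: /etc/apt/keyrings
        state: directory
        mode: "0755"

    - name: Install the Docker repository GPG key
      ansible.builtin.get_url:
        url: https://download.docker.com/linux/ubuntu/gpg
        dest: /etc/apt/keyrings/docker.asc
        mode: "0644"

    - name: Add the Docker repository
      ansible.builtin.apt_repository:
        repo: >-
          deb [signed-by=/etc/apt/keyrings/docker.asc]
          https://download.docker.com/linux/ubuntu {{ ansible_distribution_release }} stable
        filename: docker
        state: present
        update_cache: true

    - name: Pin docker-ce and docker-ce-cli to the wanted version
      ansible.builtin.copy:
        dest: /etc/apt/preferences.d/docker
        mode: "0644"
        content: |
          Package: docker-ce
          Pin: version {{ docker_version }}*
          Pin-Priority: 1001

          Package: docker-ce-cli
          Pin: version {{ docker_version }}*
          Pin-Priority: 1001

    - name: Hold containerd.io so it is only upgraded deliberately
      ansible.builtin.dpkg_selections:
        name: containerd.io
        selection: hold
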
orchestrator | TASK [osism.services.docker : Install docker package] ************************** 2025-09-27 00:31:09.698978 | orchestrator | Saturday 27 September 2025 00:30:51 +0000 (0:00:08.992) 0:05:57.589 **** 2025-09-27 00:31:09.698989 | orchestrator | ok: [testbed-manager] 2025-09-27 00:31:09.699000 | orchestrator | changed: [testbed-node-3] 2025-09-27 00:31:09.699011 | orchestrator | changed: [testbed-node-5] 2025-09-27 00:31:09.699022 | orchestrator | changed: [testbed-node-0] 2025-09-27 00:31:09.699032 | orchestrator | changed: [testbed-node-1] 2025-09-27 00:31:09.699043 | orchestrator | changed: [testbed-node-4] 2025-09-27 00:31:09.699071 | orchestrator | changed: [testbed-node-2] 2025-09-27 00:31:09.699083 | orchestrator | 2025-09-27 00:31:09.699094 | orchestrator | TASK [osism.services.docker : Unblock installation of python docker packages] *** 2025-09-27 00:31:09.699105 | orchestrator | Saturday 27 September 2025 00:31:02 +0000 (0:00:10.876) 0:06:08.465 **** 2025-09-27 00:31:09.699116 | orchestrator | ok: [testbed-manager] => (item=python3-docker) 2025-09-27 00:31:09.699127 | orchestrator | ok: [testbed-node-3] => (item=python3-docker) 2025-09-27 00:31:09.699138 | orchestrator | ok: [testbed-node-4] => (item=python3-docker) 2025-09-27 00:31:09.699149 | orchestrator | ok: [testbed-node-5] => (item=python3-docker) 2025-09-27 00:31:09.699160 | orchestrator | ok: [testbed-manager] => (item=python-docker) 2025-09-27 00:31:09.699171 | orchestrator | ok: [testbed-node-0] => (item=python3-docker) 2025-09-27 00:31:09.699181 | orchestrator | ok: [testbed-node-3] => (item=python-docker) 2025-09-27 00:31:09.699219 | orchestrator | ok: [testbed-node-1] => (item=python3-docker) 2025-09-27 00:31:09.699232 | orchestrator | ok: [testbed-node-2] => (item=python3-docker) 2025-09-27 00:31:09.699243 | orchestrator | ok: [testbed-node-4] => (item=python-docker) 2025-09-27 00:31:09.699254 | orchestrator | ok: [testbed-node-5] => (item=python-docker) 2025-09-27 00:31:09.699265 | orchestrator | ok: [testbed-node-0] => (item=python-docker) 2025-09-27 00:31:09.699275 | orchestrator | ok: [testbed-node-1] => (item=python-docker) 2025-09-27 00:31:09.699286 | orchestrator | ok: [testbed-node-2] => (item=python-docker) 2025-09-27 00:31:09.699297 | orchestrator | 2025-09-27 00:31:09.699308 | orchestrator | TASK [osism.services.docker : Install python3 docker package] ****************** 2025-09-27 00:31:09.699327 | orchestrator | Saturday 27 September 2025 00:31:03 +0000 (0:00:01.292) 0:06:09.758 **** 2025-09-27 00:31:09.699338 | orchestrator | skipping: [testbed-manager] 2025-09-27 00:31:09.699349 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:31:09.699360 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:31:09.699370 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:31:09.699381 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:31:09.699392 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:31:09.699402 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:31:09.699413 | orchestrator | 2025-09-27 00:31:09.699424 | orchestrator | TASK [osism.services.docker : Install python3 docker package from Debian Sid] *** 2025-09-27 00:31:09.699435 | orchestrator | Saturday 27 September 2025 00:31:04 +0000 (0:00:00.523) 0:06:10.282 **** 2025-09-27 00:31:09.699445 | orchestrator | ok: [testbed-manager] 2025-09-27 00:31:09.699456 | orchestrator | changed: [testbed-node-3] 2025-09-27 00:31:09.699467 | orchestrator | changed: [testbed-node-0] 2025-09-27 00:31:09.699478 | 
orchestrator | changed: [testbed-node-1] 2025-09-27 00:31:09.699488 | orchestrator | changed: [testbed-node-5] 2025-09-27 00:31:09.699499 | orchestrator | changed: [testbed-node-2] 2025-09-27 00:31:09.699510 | orchestrator | changed: [testbed-node-4] 2025-09-27 00:31:09.699520 | orchestrator | 2025-09-27 00:31:09.699531 | orchestrator | TASK [osism.services.docker : Remove python docker packages (install python bindings from pip)] *** 2025-09-27 00:31:09.699544 | orchestrator | Saturday 27 September 2025 00:31:07 +0000 (0:00:03.904) 0:06:14.186 **** 2025-09-27 00:31:09.699555 | orchestrator | skipping: [testbed-manager] 2025-09-27 00:31:09.699565 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:31:09.699576 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:31:09.699586 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:31:09.699597 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:31:09.699608 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:31:09.699618 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:31:09.699629 | orchestrator | 2025-09-27 00:31:09.699640 | orchestrator | TASK [osism.services.docker : Block installation of python docker packages (install python bindings from pip)] *** 2025-09-27 00:31:09.699652 | orchestrator | Saturday 27 September 2025 00:31:08 +0000 (0:00:00.472) 0:06:14.658 **** 2025-09-27 00:31:09.699663 | orchestrator | skipping: [testbed-manager] => (item=python3-docker)  2025-09-27 00:31:09.699673 | orchestrator | skipping: [testbed-manager] => (item=python-docker)  2025-09-27 00:31:09.699684 | orchestrator | skipping: [testbed-manager] 2025-09-27 00:31:09.699695 | orchestrator | skipping: [testbed-node-3] => (item=python3-docker)  2025-09-27 00:31:09.699706 | orchestrator | skipping: [testbed-node-3] => (item=python-docker)  2025-09-27 00:31:09.699717 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:31:09.699728 | orchestrator | skipping: [testbed-node-4] => (item=python3-docker)  2025-09-27 00:31:09.699738 | orchestrator | skipping: [testbed-node-4] => (item=python-docker)  2025-09-27 00:31:09.699749 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:31:09.699760 | orchestrator | skipping: [testbed-node-5] => (item=python3-docker)  2025-09-27 00:31:09.699770 | orchestrator | skipping: [testbed-node-5] => (item=python-docker)  2025-09-27 00:31:09.699781 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:31:09.699792 | orchestrator | skipping: [testbed-node-0] => (item=python3-docker)  2025-09-27 00:31:09.699803 | orchestrator | skipping: [testbed-node-0] => (item=python-docker)  2025-09-27 00:31:09.699813 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:31:09.699824 | orchestrator | skipping: [testbed-node-1] => (item=python3-docker)  2025-09-27 00:31:09.699835 | orchestrator | skipping: [testbed-node-1] => (item=python-docker)  2025-09-27 00:31:09.699845 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:31:09.699856 | orchestrator | skipping: [testbed-node-2] => (item=python3-docker)  2025-09-27 00:31:09.699867 | orchestrator | skipping: [testbed-node-2] => (item=python-docker)  2025-09-27 00:31:09.699877 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:31:09.699895 | orchestrator | 2025-09-27 00:31:09.699906 | orchestrator | TASK [osism.services.docker : Install python3-pip package (install python bindings from pip)] *** 2025-09-27 00:31:09.699917 | orchestrator | Saturday 27 September 2025 00:31:09 +0000 (0:00:00.740) 0:06:15.399 **** 2025-09-27 00:31:09.699928 
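
The python3-docker handling is worth a note: the pip-based binding tasks are all skipped and the package is instead installed "from Debian Sid", which on an Ubuntu node only works if Sid is added as an extra, tightly pinned source. One way such a setup is commonly expressed is sketched below; the repository line, keyring package and pin priorities are assumptions and not taken from the osism.services.docker role:

---
- hosts: all
  become: true
  tasks:
    - name: Install the Debian archive keyring (also packaged for Ubuntu)
      ansible.builtin.apt:
        name: debian-archive-keyring
        state: present

    - name: Add the Debian Sid repository
      ansible.builtin.apt_repository:
        repo: >-
          deb [signed-by=/usr/share/keyrings/debian-archive-keyring.gpg]
          http://deb.debian.org/debian sid main
        filename: sid
        state: present
        update_cache: false

    - name: Keep Sid at low priority, but prefer its python3-docker
      ansible.builtin.copy:
        dest: /etc/apt/preferences.d/python3-docker
        mode: "0644"
        content: |
          Package: *
          Pin: release n=sid
          Pin-Priority: 100

          Package: python3-docker
          Pin: release n=sid
          Pin-Priority: 600

    - name: Install python3-docker
      ansible.builtin.apt:
        name: python3-docker
        state: present
        update_cache: true
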
| orchestrator | skipping: [testbed-manager] 2025-09-27 00:31:09.699938 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:31:09.699949 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:31:09.699960 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:31:09.699971 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:31:09.699981 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:31:09.699992 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:31:09.700003 | orchestrator | 2025-09-27 00:31:09.700020 | orchestrator | TASK [osism.services.docker : Install docker packages (install python bindings from pip)] *** 2025-09-27 00:31:30.537415 | orchestrator | Saturday 27 September 2025 00:31:09 +0000 (0:00:00.531) 0:06:15.930 **** 2025-09-27 00:31:30.537536 | orchestrator | skipping: [testbed-manager] 2025-09-27 00:31:30.537557 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:31:30.537570 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:31:30.537581 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:31:30.537592 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:31:30.537603 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:31:30.537614 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:31:30.537626 | orchestrator | 2025-09-27 00:31:30.537638 | orchestrator | TASK [osism.services.docker : Install packages required by docker login] ******* 2025-09-27 00:31:30.537649 | orchestrator | Saturday 27 September 2025 00:31:10 +0000 (0:00:00.507) 0:06:16.438 **** 2025-09-27 00:31:30.537660 | orchestrator | skipping: [testbed-manager] 2025-09-27 00:31:30.537671 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:31:30.537683 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:31:30.537693 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:31:30.537705 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:31:30.537716 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:31:30.537726 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:31:30.537737 | orchestrator | 2025-09-27 00:31:30.537799 | orchestrator | TASK [osism.services.docker : Ensure that some packages are not installed] ***** 2025-09-27 00:31:30.537813 | orchestrator | Saturday 27 September 2025 00:31:10 +0000 (0:00:00.515) 0:06:16.953 **** 2025-09-27 00:31:30.537824 | orchestrator | ok: [testbed-manager] 2025-09-27 00:31:30.537837 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:31:30.537847 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:31:30.537858 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:31:30.537869 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:31:30.537880 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:31:30.537891 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:31:30.537902 | orchestrator | 2025-09-27 00:31:30.537913 | orchestrator | TASK [osism.services.docker : Include config tasks] **************************** 2025-09-27 00:31:30.537924 | orchestrator | Saturday 27 September 2025 00:31:12 +0000 (0:00:01.708) 0:06:18.662 **** 2025-09-27 00:31:30.537936 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/docker/tasks/config.yml for testbed-manager, testbed-node-3, testbed-node-4, testbed-node-5, testbed-node-0, testbed-node-1, testbed-node-2 2025-09-27 00:31:30.537951 | orchestrator | 2025-09-27 00:31:30.537964 | orchestrator | TASK [osism.services.docker : Create plugins directory] ************************ 2025-09-27 00:31:30.537976 | 
orchestrator | Saturday 27 September 2025 00:31:13 +0000 (0:00:00.980) 0:06:19.643 **** 2025-09-27 00:31:30.537988 | orchestrator | ok: [testbed-manager] 2025-09-27 00:31:30.538001 | orchestrator | changed: [testbed-node-3] 2025-09-27 00:31:30.538080 | orchestrator | changed: [testbed-node-5] 2025-09-27 00:31:30.538094 | orchestrator | changed: [testbed-node-0] 2025-09-27 00:31:30.538107 | orchestrator | changed: [testbed-node-4] 2025-09-27 00:31:30.538120 | orchestrator | changed: [testbed-node-1] 2025-09-27 00:31:30.538132 | orchestrator | changed: [testbed-node-2] 2025-09-27 00:31:30.538145 | orchestrator | 2025-09-27 00:31:30.538179 | orchestrator | TASK [osism.services.docker : Create systemd overlay directory] **************** 2025-09-27 00:31:30.538193 | orchestrator | Saturday 27 September 2025 00:31:14 +0000 (0:00:00.831) 0:06:20.474 **** 2025-09-27 00:31:30.538229 | orchestrator | ok: [testbed-manager] 2025-09-27 00:31:30.538242 | orchestrator | changed: [testbed-node-3] 2025-09-27 00:31:30.538254 | orchestrator | changed: [testbed-node-4] 2025-09-27 00:31:30.538266 | orchestrator | changed: [testbed-node-5] 2025-09-27 00:31:30.538279 | orchestrator | changed: [testbed-node-0] 2025-09-27 00:31:30.538291 | orchestrator | changed: [testbed-node-1] 2025-09-27 00:31:30.538303 | orchestrator | changed: [testbed-node-2] 2025-09-27 00:31:30.538314 | orchestrator | 2025-09-27 00:31:30.538325 | orchestrator | TASK [osism.services.docker : Copy systemd overlay file] *********************** 2025-09-27 00:31:30.538336 | orchestrator | Saturday 27 September 2025 00:31:15 +0000 (0:00:00.844) 0:06:21.319 **** 2025-09-27 00:31:30.538347 | orchestrator | ok: [testbed-manager] 2025-09-27 00:31:30.538358 | orchestrator | changed: [testbed-node-3] 2025-09-27 00:31:30.538369 | orchestrator | changed: [testbed-node-5] 2025-09-27 00:31:30.538380 | orchestrator | changed: [testbed-node-4] 2025-09-27 00:31:30.538390 | orchestrator | changed: [testbed-node-1] 2025-09-27 00:31:30.538401 | orchestrator | changed: [testbed-node-0] 2025-09-27 00:31:30.538412 | orchestrator | changed: [testbed-node-2] 2025-09-27 00:31:30.538423 | orchestrator | 2025-09-27 00:31:30.538434 | orchestrator | TASK [osism.services.docker : Reload systemd daemon if systemd overlay file is changed] *** 2025-09-27 00:31:30.538446 | orchestrator | Saturday 27 September 2025 00:31:16 +0000 (0:00:01.467) 0:06:22.786 **** 2025-09-27 00:31:30.538457 | orchestrator | skipping: [testbed-manager] 2025-09-27 00:31:30.538468 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:31:30.538479 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:31:30.538489 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:31:30.538500 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:31:30.538511 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:31:30.538521 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:31:30.538532 | orchestrator | 2025-09-27 00:31:30.538543 | orchestrator | TASK [osism.services.docker : Copy limits configuration file] ****************** 2025-09-27 00:31:30.538554 | orchestrator | Saturday 27 September 2025 00:31:17 +0000 (0:00:01.442) 0:06:24.229 **** 2025-09-27 00:31:30.538565 | orchestrator | ok: [testbed-manager] 2025-09-27 00:31:30.538576 | orchestrator | changed: [testbed-node-3] 2025-09-27 00:31:30.538587 | orchestrator | changed: [testbed-node-4] 2025-09-27 00:31:30.538597 | orchestrator | changed: [testbed-node-5] 2025-09-27 00:31:30.538608 | orchestrator | changed: [testbed-node-2] 2025-09-27 00:31:30.538619 | orchestrator | 
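
The overlay tasks drop a unit override under /etc/systemd/system/docker.service.d/ and only reload systemd when that file actually changed; the role apparently does this with a conditional task ("Reload systemd daemon if systemd overlay file is changed"), while a handler expresses the same idea. The override content below (raised file-descriptor limit, unlimited tasks) is a placeholder, since the log does not show the real file:

---
- hosts: all
  become: true
  tasks:
    - name: Create the docker systemd drop-in directory
      ansible.builtin.file:
        path: /etc/systemd/system/docker.service.d
        state: directory
        mode: "0755"

    - name: Install the docker unit override
      ansible.builtin.copy:
        dest: /etc/systemd/system/docker.service.d/overlay.conf
        mode: "0644"
        content: |
          [Service]
          LimitNOFILE=1048576
          TasksMax=infinity
      notify: Reload systemd

  handlers:
    - name: Reload systemd
      ansible.builtin.systemd:
        daemon_reload: true
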
changed: [testbed-node-0] 2025-09-27 00:31:30.538629 | orchestrator | changed: [testbed-node-1] 2025-09-27 00:31:30.538640 | orchestrator | 2025-09-27 00:31:30.538651 | orchestrator | TASK [osism.services.docker : Copy daemon.json configuration file] ************* 2025-09-27 00:31:30.538662 | orchestrator | Saturday 27 September 2025 00:31:19 +0000 (0:00:01.303) 0:06:25.533 **** 2025-09-27 00:31:30.538673 | orchestrator | changed: [testbed-manager] 2025-09-27 00:31:30.538684 | orchestrator | changed: [testbed-node-3] 2025-09-27 00:31:30.538695 | orchestrator | changed: [testbed-node-4] 2025-09-27 00:31:30.538705 | orchestrator | changed: [testbed-node-5] 2025-09-27 00:31:30.538716 | orchestrator | changed: [testbed-node-0] 2025-09-27 00:31:30.538727 | orchestrator | changed: [testbed-node-1] 2025-09-27 00:31:30.538738 | orchestrator | changed: [testbed-node-2] 2025-09-27 00:31:30.538748 | orchestrator | 2025-09-27 00:31:30.538777 | orchestrator | TASK [osism.services.docker : Include service tasks] *************************** 2025-09-27 00:31:30.538789 | orchestrator | Saturday 27 September 2025 00:31:20 +0000 (0:00:01.466) 0:06:26.999 **** 2025-09-27 00:31:30.538801 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/docker/tasks/service.yml for testbed-manager, testbed-node-3, testbed-node-4, testbed-node-5, testbed-node-0, testbed-node-1, testbed-node-2 2025-09-27 00:31:30.538812 | orchestrator | 2025-09-27 00:31:30.538823 | orchestrator | TASK [osism.services.docker : Reload systemd daemon] *************************** 2025-09-27 00:31:30.538842 | orchestrator | Saturday 27 September 2025 00:31:21 +0000 (0:00:01.125) 0:06:28.125 **** 2025-09-27 00:31:30.538853 | orchestrator | ok: [testbed-manager] 2025-09-27 00:31:30.538864 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:31:30.538875 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:31:30.538886 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:31:30.538897 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:31:30.538907 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:31:30.538918 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:31:30.538929 | orchestrator | 2025-09-27 00:31:30.538940 | orchestrator | TASK [osism.services.docker : Manage service] ********************************** 2025-09-27 00:31:30.538951 | orchestrator | Saturday 27 September 2025 00:31:23 +0000 (0:00:01.519) 0:06:29.644 **** 2025-09-27 00:31:30.538962 | orchestrator | ok: [testbed-manager] 2025-09-27 00:31:30.538973 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:31:30.538983 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:31:30.538994 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:31:30.539005 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:31:30.539016 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:31:30.539026 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:31:30.539037 | orchestrator | 2025-09-27 00:31:30.539048 | orchestrator | TASK [osism.services.docker : Manage docker socket service] ******************** 2025-09-27 00:31:30.539059 | orchestrator | Saturday 27 September 2025 00:31:24 +0000 (0:00:01.185) 0:06:30.830 **** 2025-09-27 00:31:30.539070 | orchestrator | ok: [testbed-manager] 2025-09-27 00:31:30.539081 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:31:30.539091 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:31:30.539102 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:31:30.539113 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:31:30.539124 | 
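
daemon.json is where the engine-level settings land. The keys below are common defaults and purely illustrative (the log does not show the rendered file); the restart is wired up as a handler so the daemon only bounces when the file changes, which matches the "Restart docker service" handler that fires a little further down:

---
- hosts: all
  become: true
  tasks:
    - name: Deploy /etc/docker/daemon.json
      ansible.builtin.copy:
        dest: /etc/docker/daemon.json
        mode: "0644"
        content: "{{ docker_daemon_settings | to_nice_json }}\n"
      vars:
        docker_daemon_settings:       # illustrative values only
          log-driver: json-file
          log-opts:
            max-size: "10m"
            max-file: "3"
          live-restore: true
      notify: Restart docker

  handlers:
    - name: Restart docker
      ansible.builtin.service:
        name: docker
        state: restarted
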
orchestrator | ok: [testbed-node-1] 2025-09-27 00:31:30.539134 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:31:30.539145 | orchestrator | 2025-09-27 00:31:30.539156 | orchestrator | TASK [osism.services.docker : Manage containerd service] *********************** 2025-09-27 00:31:30.539167 | orchestrator | Saturday 27 September 2025 00:31:25 +0000 (0:00:01.143) 0:06:31.974 **** 2025-09-27 00:31:30.539178 | orchestrator | ok: [testbed-manager] 2025-09-27 00:31:30.539189 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:31:30.539230 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:31:30.539250 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:31:30.539266 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:31:30.539282 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:31:30.539299 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:31:30.539315 | orchestrator | 2025-09-27 00:31:30.539331 | orchestrator | TASK [osism.services.docker : Include bootstrap tasks] ************************* 2025-09-27 00:31:30.539348 | orchestrator | Saturday 27 September 2025 00:31:26 +0000 (0:00:01.141) 0:06:33.115 **** 2025-09-27 00:31:30.539366 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/docker/tasks/bootstrap.yml for testbed-manager, testbed-node-3, testbed-node-4, testbed-node-5, testbed-node-0, testbed-node-1, testbed-node-2 2025-09-27 00:31:30.539384 | orchestrator | 2025-09-27 00:31:30.539403 | orchestrator | TASK [osism.services.docker : Flush handlers] ********************************** 2025-09-27 00:31:30.539415 | orchestrator | Saturday 27 September 2025 00:31:27 +0000 (0:00:00.990) 0:06:34.106 **** 2025-09-27 00:31:30.539426 | orchestrator | 2025-09-27 00:31:30.539437 | orchestrator | TASK [osism.services.docker : Flush handlers] ********************************** 2025-09-27 00:31:30.539448 | orchestrator | Saturday 27 September 2025 00:31:27 +0000 (0:00:00.038) 0:06:34.144 **** 2025-09-27 00:31:30.539458 | orchestrator | 2025-09-27 00:31:30.539469 | orchestrator | TASK [osism.services.docker : Flush handlers] ********************************** 2025-09-27 00:31:30.539480 | orchestrator | Saturday 27 September 2025 00:31:27 +0000 (0:00:00.043) 0:06:34.187 **** 2025-09-27 00:31:30.539491 | orchestrator | 2025-09-27 00:31:30.539501 | orchestrator | TASK [osism.services.docker : Flush handlers] ********************************** 2025-09-27 00:31:30.539512 | orchestrator | Saturday 27 September 2025 00:31:27 +0000 (0:00:00.037) 0:06:34.225 **** 2025-09-27 00:31:30.539535 | orchestrator | 2025-09-27 00:31:30.539547 | orchestrator | TASK [osism.services.docker : Flush handlers] ********************************** 2025-09-27 00:31:30.539558 | orchestrator | Saturday 27 September 2025 00:31:28 +0000 (0:00:00.037) 0:06:34.262 **** 2025-09-27 00:31:30.539568 | orchestrator | 2025-09-27 00:31:30.539579 | orchestrator | TASK [osism.services.docker : Flush handlers] ********************************** 2025-09-27 00:31:30.539590 | orchestrator | Saturday 27 September 2025 00:31:28 +0000 (0:00:00.042) 0:06:34.305 **** 2025-09-27 00:31:30.539601 | orchestrator | 2025-09-27 00:31:30.539612 | orchestrator | TASK [osism.services.docker : Flush handlers] ********************************** 2025-09-27 00:31:30.539623 | orchestrator | Saturday 27 September 2025 00:31:28 +0000 (0:00:00.037) 0:06:34.342 **** 2025-09-27 00:31:30.539633 | orchestrator | 2025-09-27 00:31:30.539644 | orchestrator | RUNNING HANDLER [osism.commons.repository : Force update of 
package cache] ***** 2025-09-27 00:31:30.539655 | orchestrator | Saturday 27 September 2025 00:31:28 +0000 (0:00:00.037) 0:06:34.380 **** 2025-09-27 00:31:30.539666 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:31:30.539677 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:31:30.539688 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:31:30.539698 | orchestrator | 2025-09-27 00:31:30.539709 | orchestrator | RUNNING HANDLER [osism.services.rsyslog : Restart rsyslog service] ************* 2025-09-27 00:31:30.539720 | orchestrator | Saturday 27 September 2025 00:31:29 +0000 (0:00:01.107) 0:06:35.488 **** 2025-09-27 00:31:30.539731 | orchestrator | changed: [testbed-manager] 2025-09-27 00:31:30.539742 | orchestrator | changed: [testbed-node-3] 2025-09-27 00:31:30.539753 | orchestrator | changed: [testbed-node-5] 2025-09-27 00:31:30.539764 | orchestrator | changed: [testbed-node-0] 2025-09-27 00:31:30.539775 | orchestrator | changed: [testbed-node-4] 2025-09-27 00:31:30.539793 | orchestrator | changed: [testbed-node-1] 2025-09-27 00:31:58.283833 | orchestrator | changed: [testbed-node-2] 2025-09-27 00:31:58.283950 | orchestrator | 2025-09-27 00:31:58.283968 | orchestrator | RUNNING HANDLER [osism.services.docker : Restart docker service] *************** 2025-09-27 00:31:58.283981 | orchestrator | Saturday 27 September 2025 00:31:30 +0000 (0:00:01.278) 0:06:36.767 **** 2025-09-27 00:31:58.283993 | orchestrator | skipping: [testbed-manager] 2025-09-27 00:31:58.284004 | orchestrator | changed: [testbed-node-3] 2025-09-27 00:31:58.284015 | orchestrator | changed: [testbed-node-4] 2025-09-27 00:31:58.284026 | orchestrator | changed: [testbed-node-5] 2025-09-27 00:31:58.284037 | orchestrator | changed: [testbed-node-0] 2025-09-27 00:31:58.284047 | orchestrator | changed: [testbed-node-2] 2025-09-27 00:31:58.284058 | orchestrator | changed: [testbed-node-1] 2025-09-27 00:31:58.284069 | orchestrator | 2025-09-27 00:31:58.284096 | orchestrator | RUNNING HANDLER [osism.services.docker : Wait after docker service restart] **** 2025-09-27 00:31:58.284107 | orchestrator | Saturday 27 September 2025 00:31:32 +0000 (0:00:02.472) 0:06:39.240 **** 2025-09-27 00:31:58.284118 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:31:58.284129 | orchestrator | 2025-09-27 00:31:58.284140 | orchestrator | TASK [osism.services.docker : Add user to docker group] ************************ 2025-09-27 00:31:58.284151 | orchestrator | Saturday 27 September 2025 00:31:33 +0000 (0:00:00.107) 0:06:39.347 **** 2025-09-27 00:31:58.284162 | orchestrator | ok: [testbed-manager] 2025-09-27 00:31:58.284174 | orchestrator | changed: [testbed-node-4] 2025-09-27 00:31:58.284185 | orchestrator | changed: [testbed-node-5] 2025-09-27 00:31:58.284196 | orchestrator | changed: [testbed-node-3] 2025-09-27 00:31:58.284240 | orchestrator | changed: [testbed-node-0] 2025-09-27 00:31:58.284251 | orchestrator | changed: [testbed-node-1] 2025-09-27 00:31:58.284262 | orchestrator | changed: [testbed-node-2] 2025-09-27 00:31:58.284273 | orchestrator | 2025-09-27 00:31:58.284284 | orchestrator | TASK [osism.services.docker : Log into private registry and force re-authorization] *** 2025-09-27 00:31:58.284296 | orchestrator | Saturday 27 September 2025 00:31:34 +0000 (0:00:01.020) 0:06:40.368 **** 2025-09-27 00:31:58.284307 | orchestrator | skipping: [testbed-manager] 2025-09-27 00:31:58.284318 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:31:58.284349 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:31:58.284360 | 
orchestrator | skipping: [testbed-node-5] 2025-09-27 00:31:58.284371 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:31:58.284382 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:31:58.284392 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:31:58.284403 | orchestrator | 2025-09-27 00:31:58.284414 | orchestrator | TASK [osism.services.docker : Include facts tasks] ***************************** 2025-09-27 00:31:58.284425 | orchestrator | Saturday 27 September 2025 00:31:34 +0000 (0:00:00.550) 0:06:40.919 **** 2025-09-27 00:31:58.284437 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/docker/tasks/facts.yml for testbed-manager, testbed-node-3, testbed-node-4, testbed-node-5, testbed-node-0, testbed-node-1, testbed-node-2 2025-09-27 00:31:58.284450 | orchestrator | 2025-09-27 00:31:58.284462 | orchestrator | TASK [osism.services.docker : Create facts directory] ************************** 2025-09-27 00:31:58.284473 | orchestrator | Saturday 27 September 2025 00:31:35 +0000 (0:00:01.027) 0:06:41.947 **** 2025-09-27 00:31:58.284483 | orchestrator | ok: [testbed-manager] 2025-09-27 00:31:58.284494 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:31:58.284505 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:31:58.284516 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:31:58.284527 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:31:58.284538 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:31:58.284549 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:31:58.284559 | orchestrator | 2025-09-27 00:31:58.284570 | orchestrator | TASK [osism.services.docker : Copy docker fact files] ************************** 2025-09-27 00:31:58.284581 | orchestrator | Saturday 27 September 2025 00:31:36 +0000 (0:00:00.947) 0:06:42.894 **** 2025-09-27 00:31:58.284592 | orchestrator | ok: [testbed-manager] => (item=docker_containers) 2025-09-27 00:31:58.284603 | orchestrator | changed: [testbed-node-3] => (item=docker_containers) 2025-09-27 00:31:58.284614 | orchestrator | changed: [testbed-node-5] => (item=docker_containers) 2025-09-27 00:31:58.284625 | orchestrator | changed: [testbed-node-4] => (item=docker_containers) 2025-09-27 00:31:58.284635 | orchestrator | changed: [testbed-node-0] => (item=docker_containers) 2025-09-27 00:31:58.284646 | orchestrator | changed: [testbed-node-1] => (item=docker_containers) 2025-09-27 00:31:58.284657 | orchestrator | ok: [testbed-manager] => (item=docker_images) 2025-09-27 00:31:58.284667 | orchestrator | changed: [testbed-node-2] => (item=docker_containers) 2025-09-27 00:31:58.284678 | orchestrator | changed: [testbed-node-3] => (item=docker_images) 2025-09-27 00:31:58.284689 | orchestrator | changed: [testbed-node-5] => (item=docker_images) 2025-09-27 00:31:58.284700 | orchestrator | changed: [testbed-node-4] => (item=docker_images) 2025-09-27 00:31:58.284710 | orchestrator | changed: [testbed-node-0] => (item=docker_images) 2025-09-27 00:31:58.284721 | orchestrator | changed: [testbed-node-1] => (item=docker_images) 2025-09-27 00:31:58.284731 | orchestrator | changed: [testbed-node-2] => (item=docker_images) 2025-09-27 00:31:58.284742 | orchestrator | 2025-09-27 00:31:58.284753 | orchestrator | TASK [osism.commons.docker_compose : This install type is not supported] ******* 2025-09-27 00:31:58.284764 | orchestrator | Saturday 27 September 2025 00:31:39 +0000 (0:00:02.631) 0:06:45.526 **** 2025-09-27 00:31:58.284774 | orchestrator | skipping: [testbed-manager] 2025-09-27 00:31:58.284785 
| orchestrator | skipping: [testbed-node-3] 2025-09-27 00:31:58.284796 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:31:58.284806 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:31:58.284817 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:31:58.284828 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:31:58.284838 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:31:58.284849 | orchestrator | 2025-09-27 00:31:58.284860 | orchestrator | TASK [osism.commons.docker_compose : Include distribution specific install tasks] *** 2025-09-27 00:31:58.284872 | orchestrator | Saturday 27 September 2025 00:31:39 +0000 (0:00:00.522) 0:06:46.048 **** 2025-09-27 00:31:58.284902 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/docker_compose/tasks/install-Debian-family.yml for testbed-manager, testbed-node-3, testbed-node-4, testbed-node-5, testbed-node-0, testbed-node-1, testbed-node-2 2025-09-27 00:31:58.284924 | orchestrator | 2025-09-27 00:31:58.284936 | orchestrator | TASK [osism.commons.docker_compose : Remove docker-compose apt preferences file] *** 2025-09-27 00:31:58.284947 | orchestrator | Saturday 27 September 2025 00:31:40 +0000 (0:00:00.973) 0:06:47.021 **** 2025-09-27 00:31:58.284957 | orchestrator | ok: [testbed-manager] 2025-09-27 00:31:58.284969 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:31:58.284979 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:31:58.284990 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:31:58.285001 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:31:58.285012 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:31:58.285022 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:31:58.285033 | orchestrator | 2025-09-27 00:31:58.285049 | orchestrator | TASK [osism.commons.docker_compose : Get checksum of docker-compose file] ****** 2025-09-27 00:31:58.285060 | orchestrator | Saturday 27 September 2025 00:31:41 +0000 (0:00:00.910) 0:06:47.932 **** 2025-09-27 00:31:58.285071 | orchestrator | ok: [testbed-manager] 2025-09-27 00:31:58.285082 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:31:58.285093 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:31:58.285103 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:31:58.285114 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:31:58.285124 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:31:58.285135 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:31:58.285145 | orchestrator | 2025-09-27 00:31:58.285156 | orchestrator | TASK [osism.commons.docker_compose : Remove docker-compose binary] ************* 2025-09-27 00:31:58.285167 | orchestrator | Saturday 27 September 2025 00:31:42 +0000 (0:00:00.856) 0:06:48.788 **** 2025-09-27 00:31:58.285178 | orchestrator | skipping: [testbed-manager] 2025-09-27 00:31:58.285189 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:31:58.285221 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:31:58.285233 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:31:58.285243 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:31:58.285254 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:31:58.285265 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:31:58.285276 | orchestrator | 2025-09-27 00:31:58.285287 | orchestrator | TASK [osism.commons.docker_compose : Uninstall docker-compose package] ********* 2025-09-27 00:31:58.285297 | orchestrator | Saturday 27 September 2025 00:31:43 +0000 (0:00:00.475) 0:06:49.264 **** 2025-09-27 
00:31:58.285308 | orchestrator | ok: [testbed-manager] 2025-09-27 00:31:58.285319 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:31:58.285330 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:31:58.285341 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:31:58.285352 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:31:58.285362 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:31:58.285373 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:31:58.285383 | orchestrator | 2025-09-27 00:31:58.285394 | orchestrator | TASK [osism.commons.docker_compose : Copy docker-compose script] *************** 2025-09-27 00:31:58.285405 | orchestrator | Saturday 27 September 2025 00:31:44 +0000 (0:00:01.700) 0:06:50.964 **** 2025-09-27 00:31:58.285416 | orchestrator | skipping: [testbed-manager] 2025-09-27 00:31:58.285427 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:31:58.285438 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:31:58.285449 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:31:58.285460 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:31:58.285470 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:31:58.285481 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:31:58.285491 | orchestrator | 2025-09-27 00:31:58.285502 | orchestrator | TASK [osism.commons.docker_compose : Install docker-compose-plugin package] **** 2025-09-27 00:31:58.285513 | orchestrator | Saturday 27 September 2025 00:31:45 +0000 (0:00:00.487) 0:06:51.452 **** 2025-09-27 00:31:58.285524 | orchestrator | ok: [testbed-manager] 2025-09-27 00:31:58.285535 | orchestrator | changed: [testbed-node-4] 2025-09-27 00:31:58.285546 | orchestrator | changed: [testbed-node-2] 2025-09-27 00:31:58.285563 | orchestrator | changed: [testbed-node-5] 2025-09-27 00:31:58.285574 | orchestrator | changed: [testbed-node-3] 2025-09-27 00:31:58.285585 | orchestrator | changed: [testbed-node-1] 2025-09-27 00:31:58.285595 | orchestrator | changed: [testbed-node-0] 2025-09-27 00:31:58.285606 | orchestrator | 2025-09-27 00:31:58.285617 | orchestrator | TASK [osism.commons.docker_compose : Copy osism.target systemd file] *********** 2025-09-27 00:31:58.285627 | orchestrator | Saturday 27 September 2025 00:31:52 +0000 (0:00:07.061) 0:06:58.514 **** 2025-09-27 00:31:58.285638 | orchestrator | ok: [testbed-manager] 2025-09-27 00:31:58.285649 | orchestrator | changed: [testbed-node-3] 2025-09-27 00:31:58.285660 | orchestrator | changed: [testbed-node-4] 2025-09-27 00:31:58.285670 | orchestrator | changed: [testbed-node-5] 2025-09-27 00:31:58.285681 | orchestrator | changed: [testbed-node-0] 2025-09-27 00:31:58.285697 | orchestrator | changed: [testbed-node-1] 2025-09-27 00:31:58.285715 | orchestrator | changed: [testbed-node-2] 2025-09-27 00:31:58.285732 | orchestrator | 2025-09-27 00:31:58.285762 | orchestrator | TASK [osism.commons.docker_compose : Enable osism.target] ********************** 2025-09-27 00:31:58.285780 | orchestrator | Saturday 27 September 2025 00:31:53 +0000 (0:00:01.242) 0:06:59.756 **** 2025-09-27 00:31:58.285797 | orchestrator | ok: [testbed-manager] 2025-09-27 00:31:58.285814 | orchestrator | changed: [testbed-node-4] 2025-09-27 00:31:58.285832 | orchestrator | changed: [testbed-node-3] 2025-09-27 00:31:58.285849 | orchestrator | changed: [testbed-node-5] 2025-09-27 00:31:58.285866 | orchestrator | changed: [testbed-node-0] 2025-09-27 00:31:58.285883 | orchestrator | changed: [testbed-node-1] 2025-09-27 00:31:58.285901 | orchestrator | changed: [testbed-node-2] 
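The docker_compose steps above install the Compose v2 plugin from the Docker apt repository and copy and enable an osism.target systemd unit for the Compose-managed services that follow. As a rough sketch only (not part of the job output), assuming shell access to one of the testbed nodes, the result of these tasks could be spot-checked like this:

# sketch, not from the job: verify the Compose v2 plugin provided by docker-compose-plugin
docker compose version

# sketch, not from the job: confirm the osism.target unit the role copied and enabled
systemctl is-enabled osism.target
systemctl cat osism.target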
2025-09-27 00:31:58.285918 | orchestrator | 2025-09-27 00:31:58.285935 | orchestrator | TASK [osism.commons.docker_compose : Copy docker-compose systemd unit file] **** 2025-09-27 00:31:58.285954 | orchestrator | Saturday 27 September 2025 00:31:55 +0000 (0:00:01.742) 0:07:01.499 **** 2025-09-27 00:31:58.285973 | orchestrator | changed: [testbed-node-3] 2025-09-27 00:31:58.285991 | orchestrator | changed: [testbed-node-4] 2025-09-27 00:31:58.286007 | orchestrator | changed: [testbed-node-5] 2025-09-27 00:31:58.286085 | orchestrator | changed: [testbed-node-0] 2025-09-27 00:31:58.286097 | orchestrator | ok: [testbed-manager] 2025-09-27 00:31:58.286108 | orchestrator | changed: [testbed-node-2] 2025-09-27 00:31:58.286119 | orchestrator | changed: [testbed-node-1] 2025-09-27 00:31:58.286130 | orchestrator | 2025-09-27 00:31:58.286141 | orchestrator | TASK [osism.commons.facts : Create custom facts directory] ********************* 2025-09-27 00:31:58.286152 | orchestrator | Saturday 27 September 2025 00:31:57 +0000 (0:00:02.161) 0:07:03.661 **** 2025-09-27 00:31:58.286163 | orchestrator | ok: [testbed-manager] 2025-09-27 00:31:58.286174 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:31:58.286184 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:31:58.286195 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:31:58.286246 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:32:29.832535 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:32:29.832652 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:32:29.832667 | orchestrator | 2025-09-27 00:32:29.832680 | orchestrator | TASK [osism.commons.facts : Copy fact files] *********************************** 2025-09-27 00:32:29.832693 | orchestrator | Saturday 27 September 2025 00:31:58 +0000 (0:00:00.853) 0:07:04.515 **** 2025-09-27 00:32:29.832704 | orchestrator | skipping: [testbed-manager] 2025-09-27 00:32:29.832716 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:32:29.832728 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:32:29.832739 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:32:29.832750 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:32:29.832761 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:32:29.832772 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:32:29.832782 | orchestrator | 2025-09-27 00:32:29.832812 | orchestrator | TASK [osism.services.chrony : Check minimum and maximum number of servers] ***** 2025-09-27 00:32:29.832823 | orchestrator | Saturday 27 September 2025 00:31:59 +0000 (0:00:00.945) 0:07:05.460 **** 2025-09-27 00:32:29.832834 | orchestrator | skipping: [testbed-manager] 2025-09-27 00:32:29.832870 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:32:29.832882 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:32:29.832893 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:32:29.832903 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:32:29.832914 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:32:29.832925 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:32:29.832935 | orchestrator | 2025-09-27 00:32:29.832946 | orchestrator | TASK [osism.services.chrony : Gather variables for each operating system] ****** 2025-09-27 00:32:29.832957 | orchestrator | Saturday 27 September 2025 00:31:59 +0000 (0:00:00.522) 0:07:05.983 **** 2025-09-27 00:32:29.832968 | orchestrator | ok: [testbed-manager] 2025-09-27 00:32:29.832979 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:32:29.832990 | orchestrator | ok: 
[testbed-node-4] 2025-09-27 00:32:29.833001 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:32:29.833012 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:32:29.833022 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:32:29.833033 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:32:29.833043 | orchestrator | 2025-09-27 00:32:29.833054 | orchestrator | TASK [osism.services.chrony : Set chrony_conf_file variable to default value] *** 2025-09-27 00:32:29.833067 | orchestrator | Saturday 27 September 2025 00:32:00 +0000 (0:00:00.540) 0:07:06.524 **** 2025-09-27 00:32:29.833081 | orchestrator | ok: [testbed-manager] 2025-09-27 00:32:29.833094 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:32:29.833107 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:32:29.833119 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:32:29.833131 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:32:29.833143 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:32:29.833156 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:32:29.833168 | orchestrator | 2025-09-27 00:32:29.833181 | orchestrator | TASK [osism.services.chrony : Set chrony_key_file variable to default value] *** 2025-09-27 00:32:29.833193 | orchestrator | Saturday 27 September 2025 00:32:00 +0000 (0:00:00.487) 0:07:07.011 **** 2025-09-27 00:32:29.833240 | orchestrator | ok: [testbed-manager] 2025-09-27 00:32:29.833254 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:32:29.833266 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:32:29.833277 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:32:29.833288 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:32:29.833299 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:32:29.833309 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:32:29.833320 | orchestrator | 2025-09-27 00:32:29.833331 | orchestrator | TASK [osism.services.chrony : Populate service facts] ************************** 2025-09-27 00:32:29.833342 | orchestrator | Saturday 27 September 2025 00:32:01 +0000 (0:00:00.497) 0:07:07.509 **** 2025-09-27 00:32:29.833352 | orchestrator | ok: [testbed-manager] 2025-09-27 00:32:29.833363 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:32:29.833374 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:32:29.833384 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:32:29.833395 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:32:29.833405 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:32:29.833416 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:32:29.833426 | orchestrator | 2025-09-27 00:32:29.833437 | orchestrator | TASK [osism.services.chrony : Manage timesyncd service] ************************ 2025-09-27 00:32:29.833448 | orchestrator | Saturday 27 September 2025 00:32:07 +0000 (0:00:05.841) 0:07:13.350 **** 2025-09-27 00:32:29.833459 | orchestrator | skipping: [testbed-manager] 2025-09-27 00:32:29.833470 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:32:29.833481 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:32:29.833492 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:32:29.833503 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:32:29.833514 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:32:29.833524 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:32:29.833535 | orchestrator | 2025-09-27 00:32:29.833546 | orchestrator | TASK [osism.services.chrony : Include distribution specific install tasks] ***** 2025-09-27 00:32:29.833557 | orchestrator | Saturday 27 September 2025 00:32:07 +0000 
(0:00:00.557) 0:07:13.907 **** 2025-09-27 00:32:29.833579 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/chrony/tasks/install-Debian-family.yml for testbed-manager, testbed-node-3, testbed-node-4, testbed-node-5, testbed-node-0, testbed-node-1, testbed-node-2 2025-09-27 00:32:29.833594 | orchestrator | 2025-09-27 00:32:29.833605 | orchestrator | TASK [osism.services.chrony : Install package] ********************************* 2025-09-27 00:32:29.833616 | orchestrator | Saturday 27 September 2025 00:32:08 +0000 (0:00:00.851) 0:07:14.759 **** 2025-09-27 00:32:29.833627 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:32:29.833638 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:32:29.833648 | orchestrator | ok: [testbed-manager] 2025-09-27 00:32:29.833659 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:32:29.833670 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:32:29.833680 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:32:29.833691 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:32:29.833702 | orchestrator | 2025-09-27 00:32:29.833713 | orchestrator | TASK [osism.services.chrony : Manage chrony service] *************************** 2025-09-27 00:32:29.833723 | orchestrator | Saturday 27 September 2025 00:32:10 +0000 (0:00:02.328) 0:07:17.087 **** 2025-09-27 00:32:29.833734 | orchestrator | ok: [testbed-manager] 2025-09-27 00:32:29.833745 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:32:29.833756 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:32:29.833766 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:32:29.833777 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:32:29.833787 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:32:29.833798 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:32:29.833809 | orchestrator | 2025-09-27 00:32:29.833844 | orchestrator | TASK [osism.services.chrony : Check if configuration file exists] ************** 2025-09-27 00:32:29.833856 | orchestrator | Saturday 27 September 2025 00:32:11 +0000 (0:00:01.155) 0:07:18.243 **** 2025-09-27 00:32:29.833867 | orchestrator | ok: [testbed-manager] 2025-09-27 00:32:29.833878 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:32:29.833889 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:32:29.833899 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:32:29.833910 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:32:29.833921 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:32:29.833931 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:32:29.833942 | orchestrator | 2025-09-27 00:32:29.833953 | orchestrator | TASK [osism.services.chrony : Copy configuration file] ************************* 2025-09-27 00:32:29.833964 | orchestrator | Saturday 27 September 2025 00:32:12 +0000 (0:00:00.914) 0:07:19.157 **** 2025-09-27 00:32:29.833975 | orchestrator | changed: [testbed-manager] => (item=/usr/share/ansible/collections/ansible_collections/osism/services/roles/chrony/templates/chrony.conf.j2) 2025-09-27 00:32:29.833988 | orchestrator | changed: [testbed-node-3] => (item=/usr/share/ansible/collections/ansible_collections/osism/services/roles/chrony/templates/chrony.conf.j2) 2025-09-27 00:32:29.833999 | orchestrator | changed: [testbed-node-4] => (item=/usr/share/ansible/collections/ansible_collections/osism/services/roles/chrony/templates/chrony.conf.j2) 2025-09-27 00:32:29.834010 | orchestrator | changed: [testbed-node-5] => 
(item=/usr/share/ansible/collections/ansible_collections/osism/services/roles/chrony/templates/chrony.conf.j2) 2025-09-27 00:32:29.834094 | orchestrator | changed: [testbed-node-0] => (item=/usr/share/ansible/collections/ansible_collections/osism/services/roles/chrony/templates/chrony.conf.j2) 2025-09-27 00:32:29.834139 | orchestrator | changed: [testbed-node-1] => (item=/usr/share/ansible/collections/ansible_collections/osism/services/roles/chrony/templates/chrony.conf.j2) 2025-09-27 00:32:29.834151 | orchestrator | changed: [testbed-node-2] => (item=/usr/share/ansible/collections/ansible_collections/osism/services/roles/chrony/templates/chrony.conf.j2) 2025-09-27 00:32:29.834162 | orchestrator | 2025-09-27 00:32:29.834174 | orchestrator | TASK [osism.services.lldpd : Include distribution specific install tasks] ****** 2025-09-27 00:32:29.834184 | orchestrator | Saturday 27 September 2025 00:32:14 +0000 (0:00:01.679) 0:07:20.837 **** 2025-09-27 00:32:29.834243 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/lldpd/tasks/install-Debian-family.yml for testbed-manager, testbed-node-3, testbed-node-4, testbed-node-5, testbed-node-0, testbed-node-1, testbed-node-2 2025-09-27 00:32:29.834257 | orchestrator | 2025-09-27 00:32:29.834268 | orchestrator | TASK [osism.services.lldpd : Install lldpd package] **************************** 2025-09-27 00:32:29.834279 | orchestrator | Saturday 27 September 2025 00:32:15 +0000 (0:00:00.997) 0:07:21.835 **** 2025-09-27 00:32:29.834290 | orchestrator | changed: [testbed-manager] 2025-09-27 00:32:29.834301 | orchestrator | changed: [testbed-node-3] 2025-09-27 00:32:29.834312 | orchestrator | changed: [testbed-node-1] 2025-09-27 00:32:29.834323 | orchestrator | changed: [testbed-node-5] 2025-09-27 00:32:29.834334 | orchestrator | changed: [testbed-node-0] 2025-09-27 00:32:29.834345 | orchestrator | changed: [testbed-node-2] 2025-09-27 00:32:29.834356 | orchestrator | changed: [testbed-node-4] 2025-09-27 00:32:29.834366 | orchestrator | 2025-09-27 00:32:29.834377 | orchestrator | TASK [osism.services.lldpd : Manage lldpd service] ***************************** 2025-09-27 00:32:29.834388 | orchestrator | Saturday 27 September 2025 00:32:24 +0000 (0:00:09.391) 0:07:31.226 **** 2025-09-27 00:32:29.834399 | orchestrator | ok: [testbed-manager] 2025-09-27 00:32:29.834410 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:32:29.834421 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:32:29.834432 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:32:29.834442 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:32:29.834453 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:32:29.834464 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:32:29.834474 | orchestrator | 2025-09-27 00:32:29.834485 | orchestrator | RUNNING HANDLER [osism.commons.docker_compose : Reload systemd daemon] ********* 2025-09-27 00:32:29.834496 | orchestrator | Saturday 27 September 2025 00:32:26 +0000 (0:00:01.846) 0:07:33.073 **** 2025-09-27 00:32:29.834507 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:32:29.834518 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:32:29.834528 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:32:29.834539 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:32:29.834550 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:32:29.834560 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:32:29.834571 | orchestrator | 2025-09-27 00:32:29.834582 | orchestrator | RUNNING HANDLER [osism.services.chrony : 
Restart chrony service] *************** 2025-09-27 00:32:29.834593 | orchestrator | Saturday 27 September 2025 00:32:28 +0000 (0:00:01.306) 0:07:34.380 **** 2025-09-27 00:32:29.834604 | orchestrator | changed: [testbed-manager] 2025-09-27 00:32:29.834615 | orchestrator | changed: [testbed-node-3] 2025-09-27 00:32:29.834626 | orchestrator | changed: [testbed-node-4] 2025-09-27 00:32:29.834637 | orchestrator | changed: [testbed-node-5] 2025-09-27 00:32:29.834647 | orchestrator | changed: [testbed-node-0] 2025-09-27 00:32:29.834658 | orchestrator | changed: [testbed-node-1] 2025-09-27 00:32:29.834669 | orchestrator | changed: [testbed-node-2] 2025-09-27 00:32:29.834680 | orchestrator | 2025-09-27 00:32:29.834691 | orchestrator | PLAY [Apply bootstrap role part 2] ********************************************* 2025-09-27 00:32:29.834702 | orchestrator | 2025-09-27 00:32:29.834712 | orchestrator | TASK [Include hardening role] ************************************************** 2025-09-27 00:32:29.834723 | orchestrator | Saturday 27 September 2025 00:32:29 +0000 (0:00:01.191) 0:07:35.571 **** 2025-09-27 00:32:29.834734 | orchestrator | skipping: [testbed-manager] 2025-09-27 00:32:29.834745 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:32:29.834756 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:32:29.834767 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:32:29.834778 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:32:29.834788 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:32:29.834808 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:32:55.940436 | orchestrator | 2025-09-27 00:32:55.940535 | orchestrator | PLAY [Apply bootstrap roles part 3] ******************************************** 2025-09-27 00:32:55.940547 | orchestrator | 2025-09-27 00:32:55.940554 | orchestrator | TASK [osism.services.journald : Copy configuration file] *********************** 2025-09-27 00:32:55.940580 | orchestrator | Saturday 27 September 2025 00:32:29 +0000 (0:00:00.494) 0:07:36.066 **** 2025-09-27 00:32:55.940587 | orchestrator | changed: [testbed-manager] 2025-09-27 00:32:55.940595 | orchestrator | changed: [testbed-node-3] 2025-09-27 00:32:55.940601 | orchestrator | changed: [testbed-node-4] 2025-09-27 00:32:55.940607 | orchestrator | changed: [testbed-node-5] 2025-09-27 00:32:55.940613 | orchestrator | changed: [testbed-node-0] 2025-09-27 00:32:55.940619 | orchestrator | changed: [testbed-node-1] 2025-09-27 00:32:55.940638 | orchestrator | changed: [testbed-node-2] 2025-09-27 00:32:55.940645 | orchestrator | 2025-09-27 00:32:55.940651 | orchestrator | TASK [osism.services.journald : Manage journald service] *********************** 2025-09-27 00:32:55.940657 | orchestrator | Saturday 27 September 2025 00:32:31 +0000 (0:00:01.336) 0:07:37.402 **** 2025-09-27 00:32:55.940663 | orchestrator | ok: [testbed-manager] 2025-09-27 00:32:55.940670 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:32:55.940676 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:32:55.940682 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:32:55.940688 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:32:55.940694 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:32:55.940700 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:32:55.940705 | orchestrator | 2025-09-27 00:32:55.940711 | orchestrator | TASK [Include auditd role] ***************************************************** 2025-09-27 00:32:55.940717 | orchestrator | Saturday 27 September 2025 00:32:32 +0000 
(0:00:01.697) 0:07:39.100 **** 2025-09-27 00:32:55.940723 | orchestrator | skipping: [testbed-manager] 2025-09-27 00:32:55.940730 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:32:55.940736 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:32:55.940742 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:32:55.940748 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:32:55.940754 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:32:55.940760 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:32:55.940766 | orchestrator | 2025-09-27 00:32:55.940773 | orchestrator | TASK [Include smartd role] ***************************************************** 2025-09-27 00:32:55.940779 | orchestrator | Saturday 27 September 2025 00:32:33 +0000 (0:00:00.497) 0:07:39.598 **** 2025-09-27 00:32:55.940785 | orchestrator | included: osism.services.smartd for testbed-manager, testbed-node-3, testbed-node-4, testbed-node-5, testbed-node-0, testbed-node-1, testbed-node-2 2025-09-27 00:32:55.940793 | orchestrator | 2025-09-27 00:32:55.940799 | orchestrator | TASK [osism.services.smartd : Include distribution specific install tasks] ***** 2025-09-27 00:32:55.940806 | orchestrator | Saturday 27 September 2025 00:32:34 +0000 (0:00:00.946) 0:07:40.545 **** 2025-09-27 00:32:55.940814 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/smartd/tasks/install-Debian-family.yml for testbed-manager, testbed-node-3, testbed-node-4, testbed-node-5, testbed-node-0, testbed-node-1, testbed-node-2 2025-09-27 00:32:55.940823 | orchestrator | 2025-09-27 00:32:55.940829 | orchestrator | TASK [osism.services.smartd : Install smartmontools package] ******************* 2025-09-27 00:32:55.940836 | orchestrator | Saturday 27 September 2025 00:32:35 +0000 (0:00:00.777) 0:07:41.322 **** 2025-09-27 00:32:55.940842 | orchestrator | changed: [testbed-manager] 2025-09-27 00:32:55.940848 | orchestrator | changed: [testbed-node-0] 2025-09-27 00:32:55.940854 | orchestrator | changed: [testbed-node-4] 2025-09-27 00:32:55.940861 | orchestrator | changed: [testbed-node-5] 2025-09-27 00:32:55.940867 | orchestrator | changed: [testbed-node-3] 2025-09-27 00:32:55.940873 | orchestrator | changed: [testbed-node-2] 2025-09-27 00:32:55.940880 | orchestrator | changed: [testbed-node-1] 2025-09-27 00:32:55.940886 | orchestrator | 2025-09-27 00:32:55.940892 | orchestrator | TASK [osism.services.smartd : Create /var/log/smartd directory] **************** 2025-09-27 00:32:55.940898 | orchestrator | Saturday 27 September 2025 00:32:43 +0000 (0:00:08.473) 0:07:49.796 **** 2025-09-27 00:32:55.940905 | orchestrator | changed: [testbed-manager] 2025-09-27 00:32:55.940911 | orchestrator | changed: [testbed-node-3] 2025-09-27 00:32:55.940917 | orchestrator | changed: [testbed-node-5] 2025-09-27 00:32:55.940929 | orchestrator | changed: [testbed-node-4] 2025-09-27 00:32:55.940936 | orchestrator | changed: [testbed-node-0] 2025-09-27 00:32:55.940942 | orchestrator | changed: [testbed-node-1] 2025-09-27 00:32:55.940948 | orchestrator | changed: [testbed-node-2] 2025-09-27 00:32:55.940954 | orchestrator | 2025-09-27 00:32:55.940961 | orchestrator | TASK [osism.services.smartd : Copy smartmontools configuration file] *********** 2025-09-27 00:32:55.940967 | orchestrator | Saturday 27 September 2025 00:32:44 +0000 (0:00:00.788) 0:07:50.584 **** 2025-09-27 00:32:55.940973 | orchestrator | changed: [testbed-manager] 2025-09-27 00:32:55.940979 | orchestrator | changed: [testbed-node-3] 
2025-09-27 00:32:55.940985 | orchestrator | changed: [testbed-node-4] 2025-09-27 00:32:55.940992 | orchestrator | changed: [testbed-node-5] 2025-09-27 00:32:55.940998 | orchestrator | changed: [testbed-node-0] 2025-09-27 00:32:55.941004 | orchestrator | changed: [testbed-node-1] 2025-09-27 00:32:55.941010 | orchestrator | changed: [testbed-node-2] 2025-09-27 00:32:55.941016 | orchestrator | 2025-09-27 00:32:55.941022 | orchestrator | TASK [osism.services.smartd : Manage smartd service] *************************** 2025-09-27 00:32:55.941028 | orchestrator | Saturday 27 September 2025 00:32:45 +0000 (0:00:01.516) 0:07:52.101 **** 2025-09-27 00:32:55.941034 | orchestrator | changed: [testbed-manager] 2025-09-27 00:32:55.941040 | orchestrator | changed: [testbed-node-3] 2025-09-27 00:32:55.941045 | orchestrator | changed: [testbed-node-4] 2025-09-27 00:32:55.941052 | orchestrator | changed: [testbed-node-5] 2025-09-27 00:32:55.941058 | orchestrator | changed: [testbed-node-0] 2025-09-27 00:32:55.941064 | orchestrator | changed: [testbed-node-1] 2025-09-27 00:32:55.941070 | orchestrator | changed: [testbed-node-2] 2025-09-27 00:32:55.941075 | orchestrator | 2025-09-27 00:32:55.941081 | orchestrator | RUNNING HANDLER [osism.services.journald : Restart journald service] *********** 2025-09-27 00:32:55.941087 | orchestrator | Saturday 27 September 2025 00:32:47 +0000 (0:00:01.725) 0:07:53.827 **** 2025-09-27 00:32:55.941093 | orchestrator | changed: [testbed-manager] 2025-09-27 00:32:55.941099 | orchestrator | changed: [testbed-node-3] 2025-09-27 00:32:55.941104 | orchestrator | changed: [testbed-node-4] 2025-09-27 00:32:55.941109 | orchestrator | changed: [testbed-node-5] 2025-09-27 00:32:55.941152 | orchestrator | changed: [testbed-node-0] 2025-09-27 00:32:55.941168 | orchestrator | changed: [testbed-node-1] 2025-09-27 00:32:55.941175 | orchestrator | changed: [testbed-node-2] 2025-09-27 00:32:55.941189 | orchestrator | 2025-09-27 00:32:55.941218 | orchestrator | RUNNING HANDLER [osism.services.smartd : Restart smartd service] *************** 2025-09-27 00:32:55.941225 | orchestrator | Saturday 27 September 2025 00:32:48 +0000 (0:00:01.229) 0:07:55.056 **** 2025-09-27 00:32:55.941231 | orchestrator | changed: [testbed-manager] 2025-09-27 00:32:55.941238 | orchestrator | changed: [testbed-node-3] 2025-09-27 00:32:55.941244 | orchestrator | changed: [testbed-node-4] 2025-09-27 00:32:55.941250 | orchestrator | changed: [testbed-node-5] 2025-09-27 00:32:55.941255 | orchestrator | changed: [testbed-node-0] 2025-09-27 00:32:55.941261 | orchestrator | changed: [testbed-node-1] 2025-09-27 00:32:55.941284 | orchestrator | changed: [testbed-node-2] 2025-09-27 00:32:55.941291 | orchestrator | 2025-09-27 00:32:55.941297 | orchestrator | PLAY [Set state bootstrap] ***************************************************** 2025-09-27 00:32:55.941302 | orchestrator | 2025-09-27 00:32:55.941309 | orchestrator | TASK [Set osism.bootstrap.status fact] ***************************************** 2025-09-27 00:32:55.941316 | orchestrator | Saturday 27 September 2025 00:32:50 +0000 (0:00:01.314) 0:07:56.370 **** 2025-09-27 00:32:55.941322 | orchestrator | included: osism.commons.state for testbed-manager, testbed-node-3, testbed-node-4, testbed-node-5, testbed-node-0, testbed-node-1, testbed-node-2 2025-09-27 00:32:55.941329 | orchestrator | 2025-09-27 00:32:55.941335 | orchestrator | TASK [osism.commons.state : Create custom facts directory] ********************* 2025-09-27 00:32:55.941341 | orchestrator | Saturday 27 
September 2025 00:32:50 +0000 (0:00:00.777) 0:07:57.148 **** 2025-09-27 00:32:55.941347 | orchestrator | ok: [testbed-manager] 2025-09-27 00:32:55.941353 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:32:55.941368 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:32:55.941375 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:32:55.941381 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:32:55.941387 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:32:55.941393 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:32:55.941399 | orchestrator | 2025-09-27 00:32:55.941405 | orchestrator | TASK [osism.commons.state : Write state into file] ***************************** 2025-09-27 00:32:55.941412 | orchestrator | Saturday 27 September 2025 00:32:51 +0000 (0:00:00.827) 0:07:57.976 **** 2025-09-27 00:32:55.941418 | orchestrator | changed: [testbed-manager] 2025-09-27 00:32:55.941424 | orchestrator | changed: [testbed-node-4] 2025-09-27 00:32:55.941430 | orchestrator | changed: [testbed-node-3] 2025-09-27 00:32:55.941436 | orchestrator | changed: [testbed-node-5] 2025-09-27 00:32:55.941442 | orchestrator | changed: [testbed-node-0] 2025-09-27 00:32:55.941449 | orchestrator | changed: [testbed-node-1] 2025-09-27 00:32:55.941455 | orchestrator | changed: [testbed-node-2] 2025-09-27 00:32:55.941461 | orchestrator | 2025-09-27 00:32:55.941467 | orchestrator | TASK [Set osism.bootstrap.timestamp fact] ************************************** 2025-09-27 00:32:55.941473 | orchestrator | Saturday 27 September 2025 00:32:52 +0000 (0:00:01.259) 0:07:59.236 **** 2025-09-27 00:32:55.941480 | orchestrator | included: osism.commons.state for testbed-manager, testbed-node-3, testbed-node-4, testbed-node-5, testbed-node-0, testbed-node-1, testbed-node-2 2025-09-27 00:32:55.941485 | orchestrator | 2025-09-27 00:32:55.941491 | orchestrator | TASK [osism.commons.state : Create custom facts directory] ********************* 2025-09-27 00:32:55.941498 | orchestrator | Saturday 27 September 2025 00:32:53 +0000 (0:00:00.812) 0:08:00.049 **** 2025-09-27 00:32:55.941503 | orchestrator | ok: [testbed-manager] 2025-09-27 00:32:55.941509 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:32:55.941515 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:32:55.941521 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:32:55.941528 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:32:55.941534 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:32:55.941540 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:32:55.941547 | orchestrator | 2025-09-27 00:32:55.941553 | orchestrator | TASK [osism.commons.state : Write state into file] ***************************** 2025-09-27 00:32:55.941558 | orchestrator | Saturday 27 September 2025 00:32:54 +0000 (0:00:00.836) 0:08:00.886 **** 2025-09-27 00:32:55.941564 | orchestrator | changed: [testbed-manager] 2025-09-27 00:32:55.941570 | orchestrator | changed: [testbed-node-3] 2025-09-27 00:32:55.941576 | orchestrator | changed: [testbed-node-4] 2025-09-27 00:32:55.941582 | orchestrator | changed: [testbed-node-5] 2025-09-27 00:32:55.941588 | orchestrator | changed: [testbed-node-0] 2025-09-27 00:32:55.941594 | orchestrator | changed: [testbed-node-1] 2025-09-27 00:32:55.941601 | orchestrator | changed: [testbed-node-2] 2025-09-27 00:32:55.941607 | orchestrator | 2025-09-27 00:32:55.941613 | orchestrator | PLAY RECAP ********************************************************************* 2025-09-27 00:32:55.941621 | orchestrator | testbed-manager : ok=164  changed=38  unreachable=0 
failed=0 skipped=42  rescued=0 ignored=0 2025-09-27 00:32:55.941628 | orchestrator | testbed-node-0 : ok=173  changed=67  unreachable=0 failed=0 skipped=36  rescued=0 ignored=0 2025-09-27 00:32:55.941634 | orchestrator | testbed-node-1 : ok=173  changed=67  unreachable=0 failed=0 skipped=36  rescued=0 ignored=0 2025-09-27 00:32:55.941641 | orchestrator | testbed-node-2 : ok=173  changed=67  unreachable=0 failed=0 skipped=36  rescued=0 ignored=0 2025-09-27 00:32:55.941647 | orchestrator | testbed-node-3 : ok=171  changed=63  unreachable=0 failed=0 skipped=38  rescued=0 ignored=0 2025-09-27 00:32:55.941653 | orchestrator | testbed-node-4 : ok=171  changed=63  unreachable=0 failed=0 skipped=37  rescued=0 ignored=0 2025-09-27 00:32:55.941666 | orchestrator | testbed-node-5 : ok=171  changed=63  unreachable=0 failed=0 skipped=37  rescued=0 ignored=0 2025-09-27 00:32:55.941673 | orchestrator | 2025-09-27 00:32:55.941679 | orchestrator | 2025-09-27 00:32:55.941695 | orchestrator | TASKS RECAP ******************************************************************** 2025-09-27 00:32:56.323945 | orchestrator | Saturday 27 September 2025 00:32:55 +0000 (0:00:01.272) 0:08:02.158 **** 2025-09-27 00:32:56.324050 | orchestrator | =============================================================================== 2025-09-27 00:32:56.324064 | orchestrator | osism.commons.packages : Install required packages --------------------- 73.36s 2025-09-27 00:32:56.324076 | orchestrator | osism.commons.packages : Download required packages -------------------- 39.05s 2025-09-27 00:32:56.324087 | orchestrator | osism.commons.cleanup : Cleanup installed packages --------------------- 34.30s 2025-09-27 00:32:56.324119 | orchestrator | osism.commons.repository : Update package cache ------------------------ 16.81s 2025-09-27 00:32:56.324130 | orchestrator | osism.commons.systohc : Install util-linux-extra package --------------- 12.46s 2025-09-27 00:32:56.324141 | orchestrator | osism.commons.packages : Remove dependencies that are no longer required -- 11.85s 2025-09-27 00:32:56.324153 | orchestrator | osism.services.docker : Install docker package ------------------------- 10.88s 2025-09-27 00:32:56.324163 | orchestrator | osism.services.docker : Install containerd package ---------------------- 9.88s 2025-09-27 00:32:56.324174 | orchestrator | osism.services.lldpd : Install lldpd package ---------------------------- 9.39s 2025-09-27 00:32:56.324185 | orchestrator | osism.services.docker : Install docker-cli package ---------------------- 8.99s 2025-09-27 00:32:56.324196 | orchestrator | osism.services.docker : Add repository ---------------------------------- 8.87s 2025-09-27 00:32:56.324262 | orchestrator | osism.services.smartd : Install smartmontools package ------------------- 8.47s 2025-09-27 00:32:56.324274 | orchestrator | osism.services.rng : Install rng package -------------------------------- 8.34s 2025-09-27 00:32:56.324285 | orchestrator | osism.commons.cleanup : Remove cloudinit package ------------------------ 7.97s 2025-09-27 00:32:56.324295 | orchestrator | osism.commons.cleanup : Uninstall unattended-upgrades package ----------- 7.88s 2025-09-27 00:32:56.324306 | orchestrator | osism.commons.docker_compose : Install docker-compose-plugin package ---- 7.06s 2025-09-27 00:32:56.324317 | orchestrator | osism.services.docker : Install apt-transport-https package ------------- 6.95s 2025-09-27 00:32:56.324327 | orchestrator | osism.commons.sysctl : Set sysctl parameters on rabbitmq ---------------- 5.91s 
2025-09-27 00:32:56.324338 | orchestrator | osism.services.chrony : Populate service facts -------------------------- 5.84s 2025-09-27 00:32:56.324352 | orchestrator | osism.commons.cleanup : Remove dependencies that are no longer required --- 5.83s 2025-09-27 00:32:56.596363 | orchestrator | + [[ -e /etc/redhat-release ]] 2025-09-27 00:32:56.596453 | orchestrator | + osism apply network 2025-09-27 00:33:09.021579 | orchestrator | 2025-09-27 00:33:09 | INFO  | Task 90597477-44f0-48e5-af4c-c0f1bbc1768a (network) was prepared for execution. 2025-09-27 00:33:09.021694 | orchestrator | 2025-09-27 00:33:09 | INFO  | It takes a moment until task 90597477-44f0-48e5-af4c-c0f1bbc1768a (network) has been started and output is visible here. 2025-09-27 00:33:37.160279 | orchestrator | 2025-09-27 00:33:37.160398 | orchestrator | PLAY [Apply role network] ****************************************************** 2025-09-27 00:33:37.160415 | orchestrator | 2025-09-27 00:33:37.160427 | orchestrator | TASK [osism.commons.network : Gather variables for each operating system] ****** 2025-09-27 00:33:37.160439 | orchestrator | Saturday 27 September 2025 00:33:13 +0000 (0:00:00.268) 0:00:00.268 **** 2025-09-27 00:33:37.160450 | orchestrator | ok: [testbed-manager] 2025-09-27 00:33:37.160463 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:33:37.160473 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:33:37.160484 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:33:37.160495 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:33:37.160506 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:33:37.160516 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:33:37.160548 | orchestrator | 2025-09-27 00:33:37.160560 | orchestrator | TASK [osism.commons.network : Include type specific tasks] ********************* 2025-09-27 00:33:37.160571 | orchestrator | Saturday 27 September 2025 00:33:13 +0000 (0:00:00.698) 0:00:00.966 **** 2025-09-27 00:33:37.160591 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/network/tasks/netplan-Debian-family.yml for testbed-manager, testbed-node-0, testbed-node-1, testbed-node-2, testbed-node-3, testbed-node-4, testbed-node-5 2025-09-27 00:33:37.160612 | orchestrator | 2025-09-27 00:33:37.160624 | orchestrator | TASK [osism.commons.network : Install required packages] *********************** 2025-09-27 00:33:37.160635 | orchestrator | Saturday 27 September 2025 00:33:15 +0000 (0:00:01.198) 0:00:02.165 **** 2025-09-27 00:33:37.160645 | orchestrator | ok: [testbed-manager] 2025-09-27 00:33:37.160657 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:33:37.160667 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:33:37.160678 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:33:37.160688 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:33:37.160699 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:33:37.160709 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:33:37.160720 | orchestrator | 2025-09-27 00:33:37.160731 | orchestrator | TASK [osism.commons.network : Remove ifupdown package] ************************* 2025-09-27 00:33:37.160742 | orchestrator | Saturday 27 September 2025 00:33:17 +0000 (0:00:02.028) 0:00:04.193 **** 2025-09-27 00:33:37.160752 | orchestrator | ok: [testbed-manager] 2025-09-27 00:33:37.160763 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:33:37.160774 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:33:37.160784 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:33:37.160794 | orchestrator | ok: 
[testbed-node-3] 2025-09-27 00:33:37.160805 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:33:37.160815 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:33:37.160826 | orchestrator | 2025-09-27 00:33:37.160836 | orchestrator | TASK [osism.commons.network : Create required directories] ********************* 2025-09-27 00:33:37.160847 | orchestrator | Saturday 27 September 2025 00:33:18 +0000 (0:00:01.704) 0:00:05.898 **** 2025-09-27 00:33:37.160858 | orchestrator | ok: [testbed-manager] => (item=/etc/netplan) 2025-09-27 00:33:37.160869 | orchestrator | ok: [testbed-node-0] => (item=/etc/netplan) 2025-09-27 00:33:37.160880 | orchestrator | ok: [testbed-node-1] => (item=/etc/netplan) 2025-09-27 00:33:37.160891 | orchestrator | ok: [testbed-node-2] => (item=/etc/netplan) 2025-09-27 00:33:37.160901 | orchestrator | ok: [testbed-node-3] => (item=/etc/netplan) 2025-09-27 00:33:37.160912 | orchestrator | ok: [testbed-node-4] => (item=/etc/netplan) 2025-09-27 00:33:37.160923 | orchestrator | ok: [testbed-node-5] => (item=/etc/netplan) 2025-09-27 00:33:37.160934 | orchestrator | 2025-09-27 00:33:37.160945 | orchestrator | TASK [osism.commons.network : Prepare netplan configuration template] ********** 2025-09-27 00:33:37.160955 | orchestrator | Saturday 27 September 2025 00:33:19 +0000 (0:00:00.961) 0:00:06.859 **** 2025-09-27 00:33:37.160984 | orchestrator | ok: [testbed-node-3 -> localhost] 2025-09-27 00:33:37.160997 | orchestrator | ok: [testbed-node-1 -> localhost] 2025-09-27 00:33:37.161007 | orchestrator | ok: [testbed-node-4 -> localhost] 2025-09-27 00:33:37.161018 | orchestrator | ok: [testbed-node-0 -> localhost] 2025-09-27 00:33:37.161029 | orchestrator | ok: [testbed-manager -> localhost] 2025-09-27 00:33:37.161039 | orchestrator | ok: [testbed-node-2 -> localhost] 2025-09-27 00:33:37.161050 | orchestrator | ok: [testbed-node-5 -> localhost] 2025-09-27 00:33:37.161060 | orchestrator | 2025-09-27 00:33:37.161071 | orchestrator | TASK [osism.commons.network : Copy netplan configuration] ********************** 2025-09-27 00:33:37.161081 | orchestrator | Saturday 27 September 2025 00:33:23 +0000 (0:00:03.313) 0:00:10.172 **** 2025-09-27 00:33:37.161092 | orchestrator | changed: [testbed-manager] 2025-09-27 00:33:37.161103 | orchestrator | changed: [testbed-node-0] 2025-09-27 00:33:37.161114 | orchestrator | changed: [testbed-node-1] 2025-09-27 00:33:37.161124 | orchestrator | changed: [testbed-node-2] 2025-09-27 00:33:37.161134 | orchestrator | changed: [testbed-node-3] 2025-09-27 00:33:37.161153 | orchestrator | changed: [testbed-node-4] 2025-09-27 00:33:37.161164 | orchestrator | changed: [testbed-node-5] 2025-09-27 00:33:37.161174 | orchestrator | 2025-09-27 00:33:37.161185 | orchestrator | TASK [osism.commons.network : Remove netplan configuration template] *********** 2025-09-27 00:33:37.161196 | orchestrator | Saturday 27 September 2025 00:33:24 +0000 (0:00:01.399) 0:00:11.572 **** 2025-09-27 00:33:37.161230 | orchestrator | ok: [testbed-manager -> localhost] 2025-09-27 00:33:37.161241 | orchestrator | ok: [testbed-node-0 -> localhost] 2025-09-27 00:33:37.161252 | orchestrator | ok: [testbed-node-2 -> localhost] 2025-09-27 00:33:37.161262 | orchestrator | ok: [testbed-node-1 -> localhost] 2025-09-27 00:33:37.161273 | orchestrator | ok: [testbed-node-3 -> localhost] 2025-09-27 00:33:37.161284 | orchestrator | ok: [testbed-node-4 -> localhost] 2025-09-27 00:33:37.161295 | orchestrator | ok: [testbed-node-5 -> localhost] 2025-09-27 00:33:37.161305 | orchestrator | 2025-09-27 
00:33:37.161316 | orchestrator | TASK [osism.commons.network : Check if path for interface file exists] ********* 2025-09-27 00:33:37.161327 | orchestrator | Saturday 27 September 2025 00:33:26 +0000 (0:00:01.914) 0:00:13.486 **** 2025-09-27 00:33:37.161338 | orchestrator | ok: [testbed-manager] 2025-09-27 00:33:37.161349 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:33:37.161359 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:33:37.161370 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:33:37.161381 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:33:37.161391 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:33:37.161402 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:33:37.161413 | orchestrator | 2025-09-27 00:33:37.161424 | orchestrator | TASK [osism.commons.network : Copy interfaces file] **************************** 2025-09-27 00:33:37.161455 | orchestrator | Saturday 27 September 2025 00:33:27 +0000 (0:00:01.062) 0:00:14.548 **** 2025-09-27 00:33:37.161467 | orchestrator | skipping: [testbed-manager] 2025-09-27 00:33:37.161477 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:33:37.161488 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:33:37.161499 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:33:37.161510 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:33:37.161521 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:33:37.161531 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:33:37.161542 | orchestrator | 2025-09-27 00:33:37.161553 | orchestrator | TASK [osism.commons.network : Install package networkd-dispatcher] ************* 2025-09-27 00:33:37.161564 | orchestrator | Saturday 27 September 2025 00:33:28 +0000 (0:00:00.650) 0:00:15.199 **** 2025-09-27 00:33:37.161575 | orchestrator | ok: [testbed-manager] 2025-09-27 00:33:37.161586 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:33:37.161597 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:33:37.161608 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:33:37.161619 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:33:37.161629 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:33:37.161640 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:33:37.161651 | orchestrator | 2025-09-27 00:33:37.161662 | orchestrator | TASK [osism.commons.network : Copy dispatcher scripts] ************************* 2025-09-27 00:33:37.161672 | orchestrator | Saturday 27 September 2025 00:33:30 +0000 (0:00:02.192) 0:00:17.392 **** 2025-09-27 00:33:37.161683 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:33:37.161694 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:33:37.161705 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:33:37.161716 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:33:37.161727 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:33:37.161737 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:33:37.161749 | orchestrator | changed: [testbed-manager] => (item={'dest': 'routable.d/iptables.sh', 'src': '/opt/configuration/network/iptables.sh'}) 2025-09-27 00:33:37.161760 | orchestrator | 2025-09-27 00:33:37.161771 | orchestrator | TASK [osism.commons.network : Manage service networkd-dispatcher] ************** 2025-09-27 00:33:37.161782 | orchestrator | Saturday 27 September 2025 00:33:31 +0000 (0:00:00.880) 0:00:18.272 **** 2025-09-27 00:33:37.161793 | orchestrator | ok: [testbed-manager] 2025-09-27 00:33:37.161811 | orchestrator | changed: [testbed-node-2] 2025-09-27 00:33:37.161822 | orchestrator 
| changed: [testbed-node-1] 2025-09-27 00:33:37.161833 | orchestrator | changed: [testbed-node-0] 2025-09-27 00:33:37.161843 | orchestrator | changed: [testbed-node-3] 2025-09-27 00:33:37.161854 | orchestrator | changed: [testbed-node-4] 2025-09-27 00:33:37.161865 | orchestrator | changed: [testbed-node-5] 2025-09-27 00:33:37.161876 | orchestrator | 2025-09-27 00:33:37.161887 | orchestrator | TASK [osism.commons.network : Include cleanup tasks] *************************** 2025-09-27 00:33:37.161897 | orchestrator | Saturday 27 September 2025 00:33:33 +0000 (0:00:01.862) 0:00:20.135 **** 2025-09-27 00:33:37.161909 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/network/tasks/cleanup-netplan.yml for testbed-manager, testbed-node-0, testbed-node-1, testbed-node-2, testbed-node-3, testbed-node-4, testbed-node-5 2025-09-27 00:33:37.161922 | orchestrator | 2025-09-27 00:33:37.161933 | orchestrator | TASK [osism.commons.network : List existing configuration files] *************** 2025-09-27 00:33:37.161944 | orchestrator | Saturday 27 September 2025 00:33:34 +0000 (0:00:01.203) 0:00:21.339 **** 2025-09-27 00:33:37.161955 | orchestrator | ok: [testbed-manager] 2025-09-27 00:33:37.161966 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:33:37.161976 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:33:37.161987 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:33:37.161998 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:33:37.162009 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:33:37.162113 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:33:37.162129 | orchestrator | 2025-09-27 00:33:37.162140 | orchestrator | TASK [osism.commons.network : Set network_configured_files fact] *************** 2025-09-27 00:33:37.162151 | orchestrator | Saturday 27 September 2025 00:33:35 +0000 (0:00:00.939) 0:00:22.278 **** 2025-09-27 00:33:37.162162 | orchestrator | ok: [testbed-manager] 2025-09-27 00:33:37.162173 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:33:37.162184 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:33:37.162194 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:33:37.162223 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:33:37.162234 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:33:37.162245 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:33:37.162255 | orchestrator | 2025-09-27 00:33:37.162266 | orchestrator | TASK [osism.commons.network : Remove unused configuration files] *************** 2025-09-27 00:33:37.162277 | orchestrator | Saturday 27 September 2025 00:33:35 +0000 (0:00:00.770) 0:00:23.048 **** 2025-09-27 00:33:37.162288 | orchestrator | skipping: [testbed-manager] => (item=/etc/netplan/01-osism.yaml)  2025-09-27 00:33:37.162299 | orchestrator | skipping: [testbed-node-0] => (item=/etc/netplan/01-osism.yaml)  2025-09-27 00:33:37.162309 | orchestrator | skipping: [testbed-node-1] => (item=/etc/netplan/01-osism.yaml)  2025-09-27 00:33:37.162320 | orchestrator | skipping: [testbed-node-2] => (item=/etc/netplan/01-osism.yaml)  2025-09-27 00:33:37.162331 | orchestrator | changed: [testbed-manager] => (item=/etc/netplan/50-cloud-init.yaml) 2025-09-27 00:33:37.162341 | orchestrator | skipping: [testbed-node-3] => (item=/etc/netplan/01-osism.yaml)  2025-09-27 00:33:37.162352 | orchestrator | changed: [testbed-node-0] => (item=/etc/netplan/50-cloud-init.yaml) 2025-09-27 00:33:37.162362 | orchestrator | skipping: [testbed-node-4] => (item=/etc/netplan/01-osism.yaml)  2025-09-27 00:33:37.162373 | 
orchestrator | changed: [testbed-node-1] => (item=/etc/netplan/50-cloud-init.yaml) 2025-09-27 00:33:37.162384 | orchestrator | skipping: [testbed-node-5] => (item=/etc/netplan/01-osism.yaml)  2025-09-27 00:33:37.162394 | orchestrator | changed: [testbed-node-2] => (item=/etc/netplan/50-cloud-init.yaml) 2025-09-27 00:33:37.162405 | orchestrator | changed: [testbed-node-3] => (item=/etc/netplan/50-cloud-init.yaml) 2025-09-27 00:33:37.162415 | orchestrator | changed: [testbed-node-4] => (item=/etc/netplan/50-cloud-init.yaml) 2025-09-27 00:33:37.162426 | orchestrator | changed: [testbed-node-5] => (item=/etc/netplan/50-cloud-init.yaml) 2025-09-27 00:33:37.162437 | orchestrator | 2025-09-27 00:33:37.162457 | orchestrator | TASK [osism.commons.network : Include dummy interfaces] ************************ 2025-09-27 00:33:52.721949 | orchestrator | Saturday 27 September 2025 00:33:37 +0000 (0:00:01.155) 0:00:24.204 **** 2025-09-27 00:33:52.722116 | orchestrator | skipping: [testbed-manager] 2025-09-27 00:33:52.722134 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:33:52.722145 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:33:52.722157 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:33:52.722167 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:33:52.722178 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:33:52.722189 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:33:52.722200 | orchestrator | 2025-09-27 00:33:52.722290 | orchestrator | TASK [osism.commons.network : Include vxlan interfaces] ************************ 2025-09-27 00:33:52.722301 | orchestrator | Saturday 27 September 2025 00:33:37 +0000 (0:00:00.621) 0:00:24.825 **** 2025-09-27 00:33:52.722313 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/network/tasks/vxlan-interfaces.yml for testbed-manager, testbed-node-0, testbed-node-1, testbed-node-2, testbed-node-4, testbed-node-3, testbed-node-5 2025-09-27 00:33:52.722328 | orchestrator | 2025-09-27 00:33:52.722339 | orchestrator | TASK [osism.commons.network : Create systemd networkd netdev files] ************ 2025-09-27 00:33:52.722350 | orchestrator | Saturday 27 September 2025 00:33:42 +0000 (0:00:04.700) 0:00:29.526 **** 2025-09-27 00:33:52.722364 | orchestrator | changed: [testbed-node-0] => (item={'key': 'vxlan0', 'value': {'addresses': [], 'dests': ['192.168.16.11', '192.168.16.12', '192.168.16.13', '192.168.16.14', '192.168.16.15', '192.168.16.5'], 'local_ip': '192.168.16.10', 'mtu': 1350, 'vni': 42}}) 2025-09-27 00:33:52.722376 | orchestrator | changed: [testbed-manager] => (item={'key': 'vxlan0', 'value': {'addresses': ['192.168.112.5/20'], 'dests': ['192.168.16.10', '192.168.16.11', '192.168.16.12', '192.168.16.13', '192.168.16.14', '192.168.16.15'], 'local_ip': '192.168.16.5', 'mtu': 1350, 'vni': 42}}) 2025-09-27 00:33:52.722401 | orchestrator | changed: [testbed-node-0] => (item={'key': 'vxlan1', 'value': {'addresses': ['192.168.128.10/20'], 'dests': ['192.168.16.11', '192.168.16.12', '192.168.16.13', '192.168.16.14', '192.168.16.15', '192.168.16.5'], 'local_ip': '192.168.16.10', 'mtu': 1350, 'vni': 23}}) 2025-09-27 00:33:52.722413 | orchestrator | changed: [testbed-node-2] => (item={'key': 'vxlan0', 'value': {'addresses': [], 'dests': ['192.168.16.10', '192.168.16.11', '192.168.16.13', '192.168.16.14', '192.168.16.15', '192.168.16.5'], 'local_ip': '192.168.16.12', 'mtu': 1350, 'vni': 42}}) 2025-09-27 00:33:52.722424 | orchestrator | changed: [testbed-manager] => 
(item={'key': 'vxlan1', 'value': {'addresses': ['192.168.128.5/20'], 'dests': ['192.168.16.10', '192.168.16.11', '192.168.16.12', '192.168.16.13', '192.168.16.14', '192.168.16.15'], 'local_ip': '192.168.16.5', 'mtu': 1350, 'vni': 23}}) 2025-09-27 00:33:52.722449 | orchestrator | changed: [testbed-node-1] => (item={'key': 'vxlan0', 'value': {'addresses': [], 'dests': ['192.168.16.10', '192.168.16.12', '192.168.16.13', '192.168.16.14', '192.168.16.15', '192.168.16.5'], 'local_ip': '192.168.16.11', 'mtu': 1350, 'vni': 42}}) 2025-09-27 00:33:52.722462 | orchestrator | changed: [testbed-node-4] => (item={'key': 'vxlan0', 'value': {'addresses': [], 'dests': ['192.168.16.10', '192.168.16.11', '192.168.16.12', '192.168.16.13', '192.168.16.15', '192.168.16.5'], 'local_ip': '192.168.16.14', 'mtu': 1350, 'vni': 42}}) 2025-09-27 00:33:52.722474 | orchestrator | changed: [testbed-node-3] => (item={'key': 'vxlan0', 'value': {'addresses': [], 'dests': ['192.168.16.10', '192.168.16.11', '192.168.16.12', '192.168.16.14', '192.168.16.15', '192.168.16.5'], 'local_ip': '192.168.16.13', 'mtu': 1350, 'vni': 42}}) 2025-09-27 00:33:52.722487 | orchestrator | changed: [testbed-node-5] => (item={'key': 'vxlan0', 'value': {'addresses': [], 'dests': ['192.168.16.10', '192.168.16.11', '192.168.16.12', '192.168.16.13', '192.168.16.14', '192.168.16.5'], 'local_ip': '192.168.16.15', 'mtu': 1350, 'vni': 42}}) 2025-09-27 00:33:52.722499 | orchestrator | changed: [testbed-node-2] => (item={'key': 'vxlan1', 'value': {'addresses': ['192.168.128.12/20'], 'dests': ['192.168.16.10', '192.168.16.11', '192.168.16.13', '192.168.16.14', '192.168.16.15', '192.168.16.5'], 'local_ip': '192.168.16.12', 'mtu': 1350, 'vni': 23}}) 2025-09-27 00:33:52.722528 | orchestrator | changed: [testbed-node-1] => (item={'key': 'vxlan1', 'value': {'addresses': ['192.168.128.11/20'], 'dests': ['192.168.16.10', '192.168.16.12', '192.168.16.13', '192.168.16.14', '192.168.16.15', '192.168.16.5'], 'local_ip': '192.168.16.11', 'mtu': 1350, 'vni': 23}}) 2025-09-27 00:33:52.722558 | orchestrator | changed: [testbed-node-4] => (item={'key': 'vxlan1', 'value': {'addresses': ['192.168.128.14/20'], 'dests': ['192.168.16.10', '192.168.16.11', '192.168.16.12', '192.168.16.13', '192.168.16.15', '192.168.16.5'], 'local_ip': '192.168.16.14', 'mtu': 1350, 'vni': 23}}) 2025-09-27 00:33:52.722570 | orchestrator | changed: [testbed-node-3] => (item={'key': 'vxlan1', 'value': {'addresses': ['192.168.128.13/20'], 'dests': ['192.168.16.10', '192.168.16.11', '192.168.16.12', '192.168.16.14', '192.168.16.15', '192.168.16.5'], 'local_ip': '192.168.16.13', 'mtu': 1350, 'vni': 23}}) 2025-09-27 00:33:52.722581 | orchestrator | changed: [testbed-node-5] => (item={'key': 'vxlan1', 'value': {'addresses': ['192.168.128.15/20'], 'dests': ['192.168.16.10', '192.168.16.11', '192.168.16.12', '192.168.16.13', '192.168.16.14', '192.168.16.5'], 'local_ip': '192.168.16.15', 'mtu': 1350, 'vni': 23}}) 2025-09-27 00:33:52.722592 | orchestrator | 2025-09-27 00:33:52.722604 | orchestrator | TASK [osism.commons.network : Create systemd networkd network files] *********** 2025-09-27 00:33:52.722615 | orchestrator | Saturday 27 September 2025 00:33:47 +0000 (0:00:05.172) 0:00:34.698 **** 2025-09-27 00:33:52.722626 | orchestrator | changed: [testbed-manager] => (item={'key': 'vxlan0', 'value': {'addresses': ['192.168.112.5/20'], 'dests': ['192.168.16.10', '192.168.16.11', '192.168.16.12', '192.168.16.13', '192.168.16.14', '192.168.16.15'], 'local_ip': '192.168.16.5', 'mtu': 1350, 'vni': 42}}) 
2025-09-27 00:33:52.722638 | orchestrator | changed: [testbed-node-3] => (item={'key': 'vxlan0', 'value': {'addresses': [], 'dests': ['192.168.16.10', '192.168.16.11', '192.168.16.12', '192.168.16.14', '192.168.16.15', '192.168.16.5'], 'local_ip': '192.168.16.13', 'mtu': 1350, 'vni': 42}}) 2025-09-27 00:33:52.722650 | orchestrator | changed: [testbed-node-0] => (item={'key': 'vxlan0', 'value': {'addresses': [], 'dests': ['192.168.16.11', '192.168.16.12', '192.168.16.13', '192.168.16.14', '192.168.16.15', '192.168.16.5'], 'local_ip': '192.168.16.10', 'mtu': 1350, 'vni': 42}}) 2025-09-27 00:33:52.722662 | orchestrator | changed: [testbed-node-1] => (item={'key': 'vxlan0', 'value': {'addresses': [], 'dests': ['192.168.16.10', '192.168.16.12', '192.168.16.13', '192.168.16.14', '192.168.16.15', '192.168.16.5'], 'local_ip': '192.168.16.11', 'mtu': 1350, 'vni': 42}}) 2025-09-27 00:33:52.722674 | orchestrator | changed: [testbed-manager] => (item={'key': 'vxlan1', 'value': {'addresses': ['192.168.128.5/20'], 'dests': ['192.168.16.10', '192.168.16.11', '192.168.16.12', '192.168.16.13', '192.168.16.14', '192.168.16.15'], 'local_ip': '192.168.16.5', 'mtu': 1350, 'vni': 23}}) 2025-09-27 00:33:52.722686 | orchestrator | changed: [testbed-node-2] => (item={'key': 'vxlan0', 'value': {'addresses': [], 'dests': ['192.168.16.10', '192.168.16.11', '192.168.16.13', '192.168.16.14', '192.168.16.15', '192.168.16.5'], 'local_ip': '192.168.16.12', 'mtu': 1350, 'vni': 42}}) 2025-09-27 00:33:52.722703 | orchestrator | changed: [testbed-node-5] => (item={'key': 'vxlan0', 'value': {'addresses': [], 'dests': ['192.168.16.10', '192.168.16.11', '192.168.16.12', '192.168.16.13', '192.168.16.14', '192.168.16.5'], 'local_ip': '192.168.16.15', 'mtu': 1350, 'vni': 42}}) 2025-09-27 00:33:52.722714 | orchestrator | changed: [testbed-node-4] => (item={'key': 'vxlan0', 'value': {'addresses': [], 'dests': ['192.168.16.10', '192.168.16.11', '192.168.16.12', '192.168.16.13', '192.168.16.15', '192.168.16.5'], 'local_ip': '192.168.16.14', 'mtu': 1350, 'vni': 42}}) 2025-09-27 00:33:52.722726 | orchestrator | changed: [testbed-node-0] => (item={'key': 'vxlan1', 'value': {'addresses': ['192.168.128.10/20'], 'dests': ['192.168.16.11', '192.168.16.12', '192.168.16.13', '192.168.16.14', '192.168.16.15', '192.168.16.5'], 'local_ip': '192.168.16.10', 'mtu': 1350, 'vni': 23}}) 2025-09-27 00:33:52.722744 | orchestrator | changed: [testbed-node-3] => (item={'key': 'vxlan1', 'value': {'addresses': ['192.168.128.13/20'], 'dests': ['192.168.16.10', '192.168.16.11', '192.168.16.12', '192.168.16.14', '192.168.16.15', '192.168.16.5'], 'local_ip': '192.168.16.13', 'mtu': 1350, 'vni': 23}}) 2025-09-27 00:33:52.722756 | orchestrator | changed: [testbed-node-1] => (item={'key': 'vxlan1', 'value': {'addresses': ['192.168.128.11/20'], 'dests': ['192.168.16.10', '192.168.16.12', '192.168.16.13', '192.168.16.14', '192.168.16.15', '192.168.16.5'], 'local_ip': '192.168.16.11', 'mtu': 1350, 'vni': 23}}) 2025-09-27 00:33:52.722767 | orchestrator | changed: [testbed-node-2] => (item={'key': 'vxlan1', 'value': {'addresses': ['192.168.128.12/20'], 'dests': ['192.168.16.10', '192.168.16.11', '192.168.16.13', '192.168.16.14', '192.168.16.15', '192.168.16.5'], 'local_ip': '192.168.16.12', 'mtu': 1350, 'vni': 23}}) 2025-09-27 00:33:52.722790 | orchestrator | changed: [testbed-node-4] => (item={'key': 'vxlan1', 'value': {'addresses': ['192.168.128.14/20'], 'dests': ['192.168.16.10', '192.168.16.11', '192.168.16.12', '192.168.16.13', '192.168.16.15', '192.168.16.5'], 
'local_ip': '192.168.16.14', 'mtu': 1350, 'vni': 23}}) 2025-09-27 00:33:58.489564 | orchestrator | changed: [testbed-node-5] => (item={'key': 'vxlan1', 'value': {'addresses': ['192.168.128.15/20'], 'dests': ['192.168.16.10', '192.168.16.11', '192.168.16.12', '192.168.16.13', '192.168.16.14', '192.168.16.5'], 'local_ip': '192.168.16.15', 'mtu': 1350, 'vni': 23}}) 2025-09-27 00:33:58.489676 | orchestrator | 2025-09-27 00:33:58.489693 | orchestrator | TASK [osism.commons.network : Include networkd cleanup tasks] ****************** 2025-09-27 00:33:58.489707 | orchestrator | Saturday 27 September 2025 00:33:52 +0000 (0:00:05.069) 0:00:39.768 **** 2025-09-27 00:33:58.489720 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/commons/roles/network/tasks/cleanup-networkd.yml for testbed-manager, testbed-node-0, testbed-node-1, testbed-node-2, testbed-node-3, testbed-node-4, testbed-node-5 2025-09-27 00:33:58.489732 | orchestrator | 2025-09-27 00:33:58.489743 | orchestrator | TASK [osism.commons.network : List existing configuration files] *************** 2025-09-27 00:33:58.489754 | orchestrator | Saturday 27 September 2025 00:33:53 +0000 (0:00:01.131) 0:00:40.900 **** 2025-09-27 00:33:58.489766 | orchestrator | ok: [testbed-manager] 2025-09-27 00:33:58.489778 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:33:58.489789 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:33:58.489799 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:33:58.489810 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:33:58.489821 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:33:58.489832 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:33:58.489842 | orchestrator | 2025-09-27 00:33:58.489854 | orchestrator | TASK [osism.commons.network : Remove unused configuration files] *************** 2025-09-27 00:33:58.489865 | orchestrator | Saturday 27 September 2025 00:33:54 +0000 (0:00:01.044) 0:00:41.944 **** 2025-09-27 00:33:58.489876 | orchestrator | skipping: [testbed-manager] => (item=/etc/systemd/network/30-vxlan1.network)  2025-09-27 00:33:58.489887 | orchestrator | skipping: [testbed-manager] => (item=/etc/systemd/network/30-vxlan0.network)  2025-09-27 00:33:58.489898 | orchestrator | skipping: [testbed-manager] => (item=/etc/systemd/network/30-vxlan1.netdev)  2025-09-27 00:33:58.489909 | orchestrator | skipping: [testbed-manager] => (item=/etc/systemd/network/30-vxlan0.netdev)  2025-09-27 00:33:58.489920 | orchestrator | skipping: [testbed-manager] 2025-09-27 00:33:58.489931 | orchestrator | skipping: [testbed-node-0] => (item=/etc/systemd/network/30-vxlan1.network)  2025-09-27 00:33:58.489942 | orchestrator | skipping: [testbed-node-0] => (item=/etc/systemd/network/30-vxlan0.network)  2025-09-27 00:33:58.489953 | orchestrator | skipping: [testbed-node-0] => (item=/etc/systemd/network/30-vxlan1.netdev)  2025-09-27 00:33:58.489987 | orchestrator | skipping: [testbed-node-0] => (item=/etc/systemd/network/30-vxlan0.netdev)  2025-09-27 00:33:58.489998 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:33:58.490009 | orchestrator | skipping: [testbed-node-1] => (item=/etc/systemd/network/30-vxlan1.network)  2025-09-27 00:33:58.490074 | orchestrator | skipping: [testbed-node-1] => (item=/etc/systemd/network/30-vxlan0.network)  2025-09-27 00:33:58.490101 | orchestrator | skipping: [testbed-node-1] => (item=/etc/systemd/network/30-vxlan1.netdev)  2025-09-27 00:33:58.490114 | orchestrator | skipping: [testbed-node-1] => (item=/etc/systemd/network/30-vxlan0.netdev)  
2025-09-27 00:33:58.490126 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:33:58.490139 | orchestrator | skipping: [testbed-node-2] => (item=/etc/systemd/network/30-vxlan1.network)  2025-09-27 00:33:58.490151 | orchestrator | skipping: [testbed-node-2] => (item=/etc/systemd/network/30-vxlan0.network)  2025-09-27 00:33:58.490163 | orchestrator | skipping: [testbed-node-2] => (item=/etc/systemd/network/30-vxlan1.netdev)  2025-09-27 00:33:58.490175 | orchestrator | skipping: [testbed-node-2] => (item=/etc/systemd/network/30-vxlan0.netdev)  2025-09-27 00:33:58.490187 | orchestrator | skipping: [testbed-node-3] => (item=/etc/systemd/network/30-vxlan1.network)  2025-09-27 00:33:58.490200 | orchestrator | skipping: [testbed-node-3] => (item=/etc/systemd/network/30-vxlan0.network)  2025-09-27 00:33:58.490237 | orchestrator | skipping: [testbed-node-3] => (item=/etc/systemd/network/30-vxlan1.netdev)  2025-09-27 00:33:58.490253 | orchestrator | skipping: [testbed-node-3] => (item=/etc/systemd/network/30-vxlan0.netdev)  2025-09-27 00:33:58.490273 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:33:58.490291 | orchestrator | skipping: [testbed-node-4] => (item=/etc/systemd/network/30-vxlan1.network)  2025-09-27 00:33:58.490309 | orchestrator | skipping: [testbed-node-4] => (item=/etc/systemd/network/30-vxlan0.network)  2025-09-27 00:33:58.490327 | orchestrator | skipping: [testbed-node-4] => (item=/etc/systemd/network/30-vxlan1.netdev)  2025-09-27 00:33:58.490345 | orchestrator | skipping: [testbed-node-4] => (item=/etc/systemd/network/30-vxlan0.netdev)  2025-09-27 00:33:58.490362 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:33:58.490380 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:33:58.490401 | orchestrator | skipping: [testbed-node-5] => (item=/etc/systemd/network/30-vxlan1.network)  2025-09-27 00:33:58.490421 | orchestrator | skipping: [testbed-node-5] => (item=/etc/systemd/network/30-vxlan0.network)  2025-09-27 00:33:58.490439 | orchestrator | skipping: [testbed-node-5] => (item=/etc/systemd/network/30-vxlan1.netdev)  2025-09-27 00:33:58.490458 | orchestrator | skipping: [testbed-node-5] => (item=/etc/systemd/network/30-vxlan0.netdev)  2025-09-27 00:33:58.490476 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:33:58.490495 | orchestrator | 2025-09-27 00:33:58.490515 | orchestrator | RUNNING HANDLER [osism.commons.network : Reload systemd-networkd] ************** 2025-09-27 00:33:58.490561 | orchestrator | Saturday 27 September 2025 00:33:56 +0000 (0:00:01.951) 0:00:43.896 **** 2025-09-27 00:33:58.490584 | orchestrator | skipping: [testbed-manager] 2025-09-27 00:33:58.490603 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:33:58.490623 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:33:58.490637 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:33:58.490654 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:33:58.490671 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:33:58.490688 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:33:58.490706 | orchestrator | 2025-09-27 00:33:58.490725 | orchestrator | RUNNING HANDLER [osism.commons.network : Netplan configuration changed] ******** 2025-09-27 00:33:58.490743 | orchestrator | Saturday 27 September 2025 00:33:57 +0000 (0:00:00.619) 0:00:44.515 **** 2025-09-27 00:33:58.490760 | orchestrator | skipping: [testbed-manager] 2025-09-27 00:33:58.490781 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:33:58.490792 | orchestrator | skipping: 
[testbed-node-1] 2025-09-27 00:33:58.490803 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:33:58.490827 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:33:58.490838 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:33:58.490848 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:33:58.490859 | orchestrator | 2025-09-27 00:33:58.490870 | orchestrator | PLAY RECAP ********************************************************************* 2025-09-27 00:33:58.490882 | orchestrator | testbed-manager : ok=21  changed=5  unreachable=0 failed=0 skipped=5  rescued=0 ignored=0 2025-09-27 00:33:58.490894 | orchestrator | testbed-node-0 : ok=20  changed=5  unreachable=0 failed=0 skipped=6  rescued=0 ignored=0 2025-09-27 00:33:58.490905 | orchestrator | testbed-node-1 : ok=20  changed=5  unreachable=0 failed=0 skipped=6  rescued=0 ignored=0 2025-09-27 00:33:58.490916 | orchestrator | testbed-node-2 : ok=20  changed=5  unreachable=0 failed=0 skipped=6  rescued=0 ignored=0 2025-09-27 00:33:58.490926 | orchestrator | testbed-node-3 : ok=20  changed=5  unreachable=0 failed=0 skipped=6  rescued=0 ignored=0 2025-09-27 00:33:58.490937 | orchestrator | testbed-node-4 : ok=20  changed=5  unreachable=0 failed=0 skipped=6  rescued=0 ignored=0 2025-09-27 00:33:58.490947 | orchestrator | testbed-node-5 : ok=20  changed=5  unreachable=0 failed=0 skipped=6  rescued=0 ignored=0 2025-09-27 00:33:58.490958 | orchestrator | 2025-09-27 00:33:58.490969 | orchestrator | 2025-09-27 00:33:58.490980 | orchestrator | TASKS RECAP ******************************************************************** 2025-09-27 00:33:58.490991 | orchestrator | Saturday 27 September 2025 00:33:58 +0000 (0:00:00.683) 0:00:45.198 **** 2025-09-27 00:33:58.491001 | orchestrator | =============================================================================== 2025-09-27 00:33:58.491020 | orchestrator | osism.commons.network : Create systemd networkd netdev files ------------ 5.17s 2025-09-27 00:33:58.491031 | orchestrator | osism.commons.network : Create systemd networkd network files ----------- 5.07s 2025-09-27 00:33:58.491042 | orchestrator | osism.commons.network : Include vxlan interfaces ------------------------ 4.70s 2025-09-27 00:33:58.491052 | orchestrator | osism.commons.network : Prepare netplan configuration template ---------- 3.31s 2025-09-27 00:33:58.491063 | orchestrator | osism.commons.network : Install package networkd-dispatcher ------------- 2.19s 2025-09-27 00:33:58.491073 | orchestrator | osism.commons.network : Install required packages ----------------------- 2.03s 2025-09-27 00:33:58.491084 | orchestrator | osism.commons.network : Remove unused configuration files --------------- 1.95s 2025-09-27 00:33:58.491094 | orchestrator | osism.commons.network : Remove netplan configuration template ----------- 1.91s 2025-09-27 00:33:58.491104 | orchestrator | osism.commons.network : Manage service networkd-dispatcher -------------- 1.86s 2025-09-27 00:33:58.491115 | orchestrator | osism.commons.network : Remove ifupdown package ------------------------- 1.70s 2025-09-27 00:33:58.491125 | orchestrator | osism.commons.network : Copy netplan configuration ---------------------- 1.40s 2025-09-27 00:33:58.491136 | orchestrator | osism.commons.network : Include cleanup tasks --------------------------- 1.20s 2025-09-27 00:33:58.491146 | orchestrator | osism.commons.network : Include type specific tasks --------------------- 1.20s 2025-09-27 00:33:58.491157 | orchestrator | osism.commons.network : Remove unused 
configuration files --------------- 1.16s 2025-09-27 00:33:58.491168 | orchestrator | osism.commons.network : Include networkd cleanup tasks ------------------ 1.13s 2025-09-27 00:33:58.491179 | orchestrator | osism.commons.network : Check if path for interface file exists --------- 1.06s 2025-09-27 00:33:58.491189 | orchestrator | osism.commons.network : List existing configuration files --------------- 1.04s 2025-09-27 00:33:58.491199 | orchestrator | osism.commons.network : Create required directories --------------------- 0.96s 2025-09-27 00:33:58.491236 | orchestrator | osism.commons.network : List existing configuration files --------------- 0.94s 2025-09-27 00:33:58.491255 | orchestrator | osism.commons.network : Copy dispatcher scripts ------------------------- 0.88s 2025-09-27 00:33:58.747965 | orchestrator | + osism apply wireguard 2025-09-27 00:34:10.778556 | orchestrator | 2025-09-27 00:34:10 | INFO  | Task 0917f1a9-5ddd-4902-921c-ecfd69edb63e (wireguard) was prepared for execution. 2025-09-27 00:34:10.778672 | orchestrator | 2025-09-27 00:34:10 | INFO  | It takes a moment until task 0917f1a9-5ddd-4902-921c-ecfd69edb63e (wireguard) has been started and output is visible here. 2025-09-27 00:34:29.158339 | orchestrator | 2025-09-27 00:34:29.158427 | orchestrator | PLAY [Apply role wireguard] **************************************************** 2025-09-27 00:34:29.158436 | orchestrator | 2025-09-27 00:34:29.158442 | orchestrator | TASK [osism.services.wireguard : Install iptables package] ********************* 2025-09-27 00:34:29.158448 | orchestrator | Saturday 27 September 2025 00:34:14 +0000 (0:00:00.219) 0:00:00.219 **** 2025-09-27 00:34:29.158454 | orchestrator | ok: [testbed-manager] 2025-09-27 00:34:29.158460 | orchestrator | 2025-09-27 00:34:29.158466 | orchestrator | TASK [osism.services.wireguard : Install wireguard package] ******************** 2025-09-27 00:34:29.158472 | orchestrator | Saturday 27 September 2025 00:34:16 +0000 (0:00:01.481) 0:00:01.701 **** 2025-09-27 00:34:29.158477 | orchestrator | changed: [testbed-manager] 2025-09-27 00:34:29.158483 | orchestrator | 2025-09-27 00:34:29.158488 | orchestrator | TASK [osism.services.wireguard : Create public and private key - server] ******* 2025-09-27 00:34:29.158493 | orchestrator | Saturday 27 September 2025 00:34:22 +0000 (0:00:06.237) 0:00:07.939 **** 2025-09-27 00:34:29.158498 | orchestrator | changed: [testbed-manager] 2025-09-27 00:34:29.158503 | orchestrator | 2025-09-27 00:34:29.158508 | orchestrator | TASK [osism.services.wireguard : Create preshared key] ************************* 2025-09-27 00:34:29.158514 | orchestrator | Saturday 27 September 2025 00:34:22 +0000 (0:00:00.527) 0:00:08.467 **** 2025-09-27 00:34:29.158519 | orchestrator | changed: [testbed-manager] 2025-09-27 00:34:29.158524 | orchestrator | 2025-09-27 00:34:29.158529 | orchestrator | TASK [osism.services.wireguard : Get preshared key] **************************** 2025-09-27 00:34:29.158534 | orchestrator | Saturday 27 September 2025 00:34:23 +0000 (0:00:00.420) 0:00:08.887 **** 2025-09-27 00:34:29.158539 | orchestrator | ok: [testbed-manager] 2025-09-27 00:34:29.158544 | orchestrator | 2025-09-27 00:34:29.158549 | orchestrator | TASK [osism.services.wireguard : Get public key - server] ********************** 2025-09-27 00:34:29.158554 | orchestrator | Saturday 27 September 2025 00:34:23 +0000 (0:00:00.485) 0:00:09.373 **** 2025-09-27 00:34:29.158560 | orchestrator | ok: [testbed-manager] 2025-09-27 00:34:29.158565 | orchestrator | 
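
For reference, the "Create systemd networkd netdev files" and "Create systemd networkd network files" tasks of the network role above render one pair of unit files per VXLAN interface from the logged parameters (VNI 42 for vxlan0, VNI 23 for vxlan1, MTU 1350, the host's local_ip as tunnel source and the other nodes as flood destinations). A minimal sketch of such a pair for testbed-node-0's vxlan1 follows; the file names and exact option spellings are assumptions for illustration, not copied from the role's templates:

# 30-vxlan1.netdev -- defines the VXLAN device itself (illustrative sketch)
[NetDev]
Name=vxlan1
Kind=vxlan
MTUBytes=1350

[VXLAN]
VNI=23
Local=192.168.16.10
MacLearning=true

# 30-vxlan1.network -- assigns the overlay address seen in the log
[Match]
Name=vxlan1

[Network]
Address=192.168.128.10/20

# The underlay interface's .network file would additionally attach the device
# (VXLAN=vxlan1) and carry one all-zero [BridgeFDB] entry per peer, e.g.
# MACAddress=00:00:00:00:00:00 with Destination=192.168.16.11, so that
# broadcast/unknown-unicast traffic is replicated to every "dests" address
# listed in the task output above.
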
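
The wireguard play running here creates a server key pair and a preshared key, copies /etc/wireguard/wg0.conf plus a matching client configuration, and then enables wg-quick@wg0.service. Since the service is managed through wg-quick, the rendered file is in wg-quick format; a hypothetical sketch is shown below, with every value a placeholder rather than something taken from this run:

# /etc/wireguard/wg0.conf (illustrative placeholders only)
[Interface]
Address    = <VPN address of testbed-manager>
ListenPort = <listen port, e.g. the WireGuard default 51820>
PrivateKey = <key from "Create public and private key - server">

[Peer]
PublicKey    = <client public key>
PresharedKey = <key from "Create preshared key">
AllowedIPs   = <VPN address of the client>
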
2025-09-27 00:34:29.158570 | orchestrator | TASK [osism.services.wireguard : Get private key - server] ********************* 2025-09-27 00:34:29.158575 | orchestrator | Saturday 27 September 2025 00:34:24 +0000 (0:00:00.427) 0:00:09.800 **** 2025-09-27 00:34:29.158580 | orchestrator | ok: [testbed-manager] 2025-09-27 00:34:29.158585 | orchestrator | 2025-09-27 00:34:29.158590 | orchestrator | TASK [osism.services.wireguard : Copy wg0.conf configuration file] ************* 2025-09-27 00:34:29.158596 | orchestrator | Saturday 27 September 2025 00:34:24 +0000 (0:00:00.354) 0:00:10.155 **** 2025-09-27 00:34:29.158601 | orchestrator | changed: [testbed-manager] 2025-09-27 00:34:29.158606 | orchestrator | 2025-09-27 00:34:29.158611 | orchestrator | TASK [osism.services.wireguard : Copy client configuration files] ************** 2025-09-27 00:34:29.158616 | orchestrator | Saturday 27 September 2025 00:34:25 +0000 (0:00:01.036) 0:00:11.191 **** 2025-09-27 00:34:29.158621 | orchestrator | changed: [testbed-manager] => (item=None) 2025-09-27 00:34:29.158627 | orchestrator | changed: [testbed-manager] 2025-09-27 00:34:29.158632 | orchestrator | 2025-09-27 00:34:29.158637 | orchestrator | TASK [osism.services.wireguard : Manage wg-quick@wg0.service service] ********** 2025-09-27 00:34:29.158642 | orchestrator | Saturday 27 September 2025 00:34:26 +0000 (0:00:00.815) 0:00:12.007 **** 2025-09-27 00:34:29.158647 | orchestrator | changed: [testbed-manager] 2025-09-27 00:34:29.158652 | orchestrator | 2025-09-27 00:34:29.158670 | orchestrator | RUNNING HANDLER [osism.services.wireguard : Restart wg0 service] *************** 2025-09-27 00:34:29.158695 | orchestrator | Saturday 27 September 2025 00:34:28 +0000 (0:00:01.506) 0:00:13.514 **** 2025-09-27 00:34:29.158701 | orchestrator | changed: [testbed-manager] 2025-09-27 00:34:29.158706 | orchestrator | 2025-09-27 00:34:29.158711 | orchestrator | PLAY RECAP ********************************************************************* 2025-09-27 00:34:29.158716 | orchestrator | testbed-manager : ok=11  changed=7  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 2025-09-27 00:34:29.158722 | orchestrator | 2025-09-27 00:34:29.158727 | orchestrator | 2025-09-27 00:34:29.158732 | orchestrator | TASKS RECAP ******************************************************************** 2025-09-27 00:34:29.158737 | orchestrator | Saturday 27 September 2025 00:34:28 +0000 (0:00:00.926) 0:00:14.440 **** 2025-09-27 00:34:29.158742 | orchestrator | =============================================================================== 2025-09-27 00:34:29.158747 | orchestrator | osism.services.wireguard : Install wireguard package -------------------- 6.24s 2025-09-27 00:34:29.158752 | orchestrator | osism.services.wireguard : Manage wg-quick@wg0.service service ---------- 1.51s 2025-09-27 00:34:29.158758 | orchestrator | osism.services.wireguard : Install iptables package --------------------- 1.48s 2025-09-27 00:34:29.158763 | orchestrator | osism.services.wireguard : Copy wg0.conf configuration file ------------- 1.04s 2025-09-27 00:34:29.158768 | orchestrator | osism.services.wireguard : Restart wg0 service -------------------------- 0.93s 2025-09-27 00:34:29.158773 | orchestrator | osism.services.wireguard : Copy client configuration files -------------- 0.82s 2025-09-27 00:34:29.158778 | orchestrator | osism.services.wireguard : Create public and private key - server ------- 0.53s 2025-09-27 00:34:29.158783 | orchestrator | osism.services.wireguard : Get preshared key 
---------------------------- 0.49s 2025-09-27 00:34:29.158788 | orchestrator | osism.services.wireguard : Get public key - server ---------------------- 0.43s 2025-09-27 00:34:29.158793 | orchestrator | osism.services.wireguard : Create preshared key ------------------------- 0.42s 2025-09-27 00:34:29.158798 | orchestrator | osism.services.wireguard : Get private key - server --------------------- 0.35s 2025-09-27 00:34:29.365823 | orchestrator | + sh -c /opt/configuration/scripts/prepare-wireguard-configuration.sh 2025-09-27 00:34:29.406551 | orchestrator | % Total % Received % Xferd Average Speed Time Time Time Current 2025-09-27 00:34:29.406597 | orchestrator | Dload Upload Total Spent Left Speed 2025-09-27 00:34:29.485875 | orchestrator | 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0 100 14 100 14 0 0 175 0 --:--:-- --:--:-- --:--:-- 177 2025-09-27 00:34:29.498441 | orchestrator | + osism apply --environment custom workarounds 2025-09-27 00:34:31.172371 | orchestrator | 2025-09-27 00:34:31 | INFO  | Trying to run play workarounds in environment custom 2025-09-27 00:34:41.272586 | orchestrator | 2025-09-27 00:34:41 | INFO  | Task 1837920e-3730-44fe-b93c-b3c6a28f4dd5 (workarounds) was prepared for execution. 2025-09-27 00:34:41.272704 | orchestrator | 2025-09-27 00:34:41 | INFO  | It takes a moment until task 1837920e-3730-44fe-b93c-b3c6a28f4dd5 (workarounds) has been started and output is visible here. 2025-09-27 00:35:04.875363 | orchestrator | 2025-09-27 00:35:04.875465 | orchestrator | PLAY [Group hosts based on configuration] ************************************** 2025-09-27 00:35:04.875482 | orchestrator | 2025-09-27 00:35:04.875494 | orchestrator | TASK [Group hosts based on virtualization_role] ******************************** 2025-09-27 00:35:04.875506 | orchestrator | Saturday 27 September 2025 00:34:45 +0000 (0:00:00.145) 0:00:00.145 **** 2025-09-27 00:35:04.875517 | orchestrator | changed: [testbed-manager] => (item=virtualization_role_guest) 2025-09-27 00:35:04.875528 | orchestrator | changed: [testbed-node-0] => (item=virtualization_role_guest) 2025-09-27 00:35:04.875539 | orchestrator | changed: [testbed-node-1] => (item=virtualization_role_guest) 2025-09-27 00:35:04.875549 | orchestrator | changed: [testbed-node-2] => (item=virtualization_role_guest) 2025-09-27 00:35:04.875560 | orchestrator | changed: [testbed-node-3] => (item=virtualization_role_guest) 2025-09-27 00:35:04.875593 | orchestrator | changed: [testbed-node-4] => (item=virtualization_role_guest) 2025-09-27 00:35:04.875605 | orchestrator | changed: [testbed-node-5] => (item=virtualization_role_guest) 2025-09-27 00:35:04.875616 | orchestrator | 2025-09-27 00:35:04.875627 | orchestrator | PLAY [Apply netplan configuration on the manager node] ************************* 2025-09-27 00:35:04.875637 | orchestrator | 2025-09-27 00:35:04.875648 | orchestrator | TASK [Apply netplan configuration] ********************************************* 2025-09-27 00:35:04.875659 | orchestrator | Saturday 27 September 2025 00:34:45 +0000 (0:00:00.602) 0:00:00.748 **** 2025-09-27 00:35:04.875670 | orchestrator | ok: [testbed-manager] 2025-09-27 00:35:04.875681 | orchestrator | 2025-09-27 00:35:04.875692 | orchestrator | PLAY [Apply netplan configuration on all other nodes] ************************** 2025-09-27 00:35:04.875703 | orchestrator | 2025-09-27 00:35:04.875713 | orchestrator | TASK [Apply netplan configuration] ********************************************* 2025-09-27 00:35:04.875724 | orchestrator | Saturday 27 September 
2025 00:34:47 +0000 (0:00:01.948) 0:00:02.696 **** 2025-09-27 00:35:04.875735 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:35:04.875746 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:35:04.875756 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:35:04.875767 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:35:04.875777 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:35:04.875788 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:35:04.875798 | orchestrator | 2025-09-27 00:35:04.875809 | orchestrator | PLAY [Add custom CA certificates to non-manager nodes] ************************* 2025-09-27 00:35:04.875820 | orchestrator | 2025-09-27 00:35:04.875830 | orchestrator | TASK [Copy custom CA certificates] ********************************************* 2025-09-27 00:35:04.875841 | orchestrator | Saturday 27 September 2025 00:34:49 +0000 (0:00:01.671) 0:00:04.367 **** 2025-09-27 00:35:04.875864 | orchestrator | changed: [testbed-node-2] => (item=/opt/configuration/environments/kolla/certificates/ca/testbed.crt) 2025-09-27 00:35:04.875879 | orchestrator | changed: [testbed-node-0] => (item=/opt/configuration/environments/kolla/certificates/ca/testbed.crt) 2025-09-27 00:35:04.875892 | orchestrator | changed: [testbed-node-4] => (item=/opt/configuration/environments/kolla/certificates/ca/testbed.crt) 2025-09-27 00:35:04.875905 | orchestrator | changed: [testbed-node-3] => (item=/opt/configuration/environments/kolla/certificates/ca/testbed.crt) 2025-09-27 00:35:04.875917 | orchestrator | changed: [testbed-node-1] => (item=/opt/configuration/environments/kolla/certificates/ca/testbed.crt) 2025-09-27 00:35:04.875930 | orchestrator | changed: [testbed-node-5] => (item=/opt/configuration/environments/kolla/certificates/ca/testbed.crt) 2025-09-27 00:35:04.875942 | orchestrator | 2025-09-27 00:35:04.875955 | orchestrator | TASK [Run update-ca-certificates] ********************************************** 2025-09-27 00:35:04.875968 | orchestrator | Saturday 27 September 2025 00:34:50 +0000 (0:00:01.245) 0:00:05.613 **** 2025-09-27 00:35:04.875981 | orchestrator | changed: [testbed-node-0] 2025-09-27 00:35:04.875994 | orchestrator | changed: [testbed-node-2] 2025-09-27 00:35:04.876006 | orchestrator | changed: [testbed-node-1] 2025-09-27 00:35:04.876018 | orchestrator | changed: [testbed-node-3] 2025-09-27 00:35:04.876031 | orchestrator | changed: [testbed-node-4] 2025-09-27 00:35:04.876044 | orchestrator | changed: [testbed-node-5] 2025-09-27 00:35:04.876056 | orchestrator | 2025-09-27 00:35:04.876069 | orchestrator | TASK [Run update-ca-trust] ***************************************************** 2025-09-27 00:35:04.876081 | orchestrator | Saturday 27 September 2025 00:34:54 +0000 (0:00:03.538) 0:00:09.151 **** 2025-09-27 00:35:04.876094 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:35:04.876107 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:35:04.876118 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:35:04.876132 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:35:04.876145 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:35:04.876158 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:35:04.876170 | orchestrator | 2025-09-27 00:35:04.876183 | orchestrator | PLAY [Add a workaround service] ************************************************ 2025-09-27 00:35:04.876256 | orchestrator | 2025-09-27 00:35:04.876270 | orchestrator | TASK [Copy workarounds.sh scripts] ********************************************* 2025-09-27 00:35:04.876281 
| orchestrator | Saturday 27 September 2025 00:34:54 +0000 (0:00:00.704) 0:00:09.855 **** 2025-09-27 00:35:04.876292 | orchestrator | changed: [testbed-manager] 2025-09-27 00:35:04.876303 | orchestrator | changed: [testbed-node-0] 2025-09-27 00:35:04.876314 | orchestrator | changed: [testbed-node-1] 2025-09-27 00:35:04.876325 | orchestrator | changed: [testbed-node-2] 2025-09-27 00:35:04.876336 | orchestrator | changed: [testbed-node-3] 2025-09-27 00:35:04.876346 | orchestrator | changed: [testbed-node-4] 2025-09-27 00:35:04.876357 | orchestrator | changed: [testbed-node-5] 2025-09-27 00:35:04.876368 | orchestrator | 2025-09-27 00:35:04.876379 | orchestrator | TASK [Copy workarounds systemd unit file] ************************************** 2025-09-27 00:35:04.876390 | orchestrator | Saturday 27 September 2025 00:34:56 +0000 (0:00:01.678) 0:00:11.534 **** 2025-09-27 00:35:04.876401 | orchestrator | changed: [testbed-manager] 2025-09-27 00:35:04.876412 | orchestrator | changed: [testbed-node-0] 2025-09-27 00:35:04.876422 | orchestrator | changed: [testbed-node-1] 2025-09-27 00:35:04.876433 | orchestrator | changed: [testbed-node-2] 2025-09-27 00:35:04.876444 | orchestrator | changed: [testbed-node-3] 2025-09-27 00:35:04.876455 | orchestrator | changed: [testbed-node-4] 2025-09-27 00:35:04.876482 | orchestrator | changed: [testbed-node-5] 2025-09-27 00:35:04.876493 | orchestrator | 2025-09-27 00:35:04.876504 | orchestrator | TASK [Reload systemd daemon] *************************************************** 2025-09-27 00:35:04.876515 | orchestrator | Saturday 27 September 2025 00:34:58 +0000 (0:00:01.651) 0:00:13.185 **** 2025-09-27 00:35:04.876526 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:35:04.876537 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:35:04.876548 | orchestrator | ok: [testbed-manager] 2025-09-27 00:35:04.876559 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:35:04.876569 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:35:04.876580 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:35:04.876591 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:35:04.876601 | orchestrator | 2025-09-27 00:35:04.876612 | orchestrator | TASK [Enable workarounds.service (Debian)] ************************************* 2025-09-27 00:35:04.876623 | orchestrator | Saturday 27 September 2025 00:34:59 +0000 (0:00:01.484) 0:00:14.670 **** 2025-09-27 00:35:04.876634 | orchestrator | changed: [testbed-manager] 2025-09-27 00:35:04.876645 | orchestrator | changed: [testbed-node-0] 2025-09-27 00:35:04.876655 | orchestrator | changed: [testbed-node-1] 2025-09-27 00:35:04.876666 | orchestrator | changed: [testbed-node-2] 2025-09-27 00:35:04.876677 | orchestrator | changed: [testbed-node-3] 2025-09-27 00:35:04.876687 | orchestrator | changed: [testbed-node-4] 2025-09-27 00:35:04.876698 | orchestrator | changed: [testbed-node-5] 2025-09-27 00:35:04.876709 | orchestrator | 2025-09-27 00:35:04.876720 | orchestrator | TASK [Enable and start workarounds.service (RedHat)] *************************** 2025-09-27 00:35:04.876730 | orchestrator | Saturday 27 September 2025 00:35:01 +0000 (0:00:01.853) 0:00:16.524 **** 2025-09-27 00:35:04.876741 | orchestrator | skipping: [testbed-manager] 2025-09-27 00:35:04.876752 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:35:04.876763 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:35:04.876773 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:35:04.876784 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:35:04.876795 | 
orchestrator | skipping: [testbed-node-4] 2025-09-27 00:35:04.876805 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:35:04.876816 | orchestrator | 2025-09-27 00:35:04.876827 | orchestrator | PLAY [On Ubuntu 24.04 install python3-docker from Debian Sid] ****************** 2025-09-27 00:35:04.876838 | orchestrator | 2025-09-27 00:35:04.876849 | orchestrator | TASK [Install python3-docker] ************************************************** 2025-09-27 00:35:04.876860 | orchestrator | Saturday 27 September 2025 00:35:02 +0000 (0:00:00.602) 0:00:17.126 **** 2025-09-27 00:35:04.876871 | orchestrator | ok: [testbed-manager] 2025-09-27 00:35:04.876881 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:35:04.876900 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:35:04.876915 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:35:04.876926 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:35:04.876937 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:35:04.876948 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:35:04.876959 | orchestrator | 2025-09-27 00:35:04.876970 | orchestrator | PLAY RECAP ********************************************************************* 2025-09-27 00:35:04.876982 | orchestrator | testbed-manager : ok=7  changed=4  unreachable=0 failed=0 skipped=1  rescued=0 ignored=0 2025-09-27 00:35:04.876993 | orchestrator | testbed-node-0 : ok=9  changed=6  unreachable=0 failed=0 skipped=2  rescued=0 ignored=0 2025-09-27 00:35:04.877004 | orchestrator | testbed-node-1 : ok=9  changed=6  unreachable=0 failed=0 skipped=2  rescued=0 ignored=0 2025-09-27 00:35:04.877015 | orchestrator | testbed-node-2 : ok=9  changed=6  unreachable=0 failed=0 skipped=2  rescued=0 ignored=0 2025-09-27 00:35:04.877026 | orchestrator | testbed-node-3 : ok=9  changed=6  unreachable=0 failed=0 skipped=2  rescued=0 ignored=0 2025-09-27 00:35:04.877037 | orchestrator | testbed-node-4 : ok=9  changed=6  unreachable=0 failed=0 skipped=2  rescued=0 ignored=0 2025-09-27 00:35:04.877048 | orchestrator | testbed-node-5 : ok=9  changed=6  unreachable=0 failed=0 skipped=2  rescued=0 ignored=0 2025-09-27 00:35:04.877059 | orchestrator | 2025-09-27 00:35:04.877070 | orchestrator | 2025-09-27 00:35:04.877081 | orchestrator | TASKS RECAP ******************************************************************** 2025-09-27 00:35:04.877092 | orchestrator | Saturday 27 September 2025 00:35:04 +0000 (0:00:02.742) 0:00:19.869 **** 2025-09-27 00:35:04.877103 | orchestrator | =============================================================================== 2025-09-27 00:35:04.877113 | orchestrator | Run update-ca-certificates ---------------------------------------------- 3.54s 2025-09-27 00:35:04.877124 | orchestrator | Install python3-docker -------------------------------------------------- 2.74s 2025-09-27 00:35:04.877135 | orchestrator | Apply netplan configuration --------------------------------------------- 1.95s 2025-09-27 00:35:04.877145 | orchestrator | Enable workarounds.service (Debian) ------------------------------------- 1.85s 2025-09-27 00:35:04.877156 | orchestrator | Copy workarounds.sh scripts --------------------------------------------- 1.68s 2025-09-27 00:35:04.877167 | orchestrator | Apply netplan configuration --------------------------------------------- 1.67s 2025-09-27 00:35:04.877177 | orchestrator | Copy workarounds systemd unit file -------------------------------------- 1.65s 2025-09-27 00:35:04.877188 | orchestrator | Reload systemd daemon 
--------------------------------------------------- 1.48s 2025-09-27 00:35:04.877199 | orchestrator | Copy custom CA certificates --------------------------------------------- 1.25s 2025-09-27 00:35:04.877226 | orchestrator | Run update-ca-trust ----------------------------------------------------- 0.70s 2025-09-27 00:35:04.877245 | orchestrator | Enable and start workarounds.service (RedHat) --------------------------- 0.60s 2025-09-27 00:35:04.877264 | orchestrator | Group hosts based on virtualization_role -------------------------------- 0.60s 2025-09-27 00:35:05.452923 | orchestrator | + osism apply reboot -l testbed-nodes -e ireallymeanit=yes 2025-09-27 00:35:17.420983 | orchestrator | 2025-09-27 00:35:17 | INFO  | Task 3b7916f0-3c90-4ccd-aeda-a4d14f3833d7 (reboot) was prepared for execution. 2025-09-27 00:35:17.421097 | orchestrator | 2025-09-27 00:35:17 | INFO  | It takes a moment until task 3b7916f0-3c90-4ccd-aeda-a4d14f3833d7 (reboot) has been started and output is visible here. 2025-09-27 00:35:27.080961 | orchestrator | 2025-09-27 00:35:27.081065 | orchestrator | PLAY [Reboot systems] ********************************************************** 2025-09-27 00:35:27.081080 | orchestrator | 2025-09-27 00:35:27.081114 | orchestrator | TASK [Exit playbook, if user did not mean to reboot systems] ******************* 2025-09-27 00:35:27.081125 | orchestrator | Saturday 27 September 2025 00:35:21 +0000 (0:00:00.205) 0:00:00.205 **** 2025-09-27 00:35:27.081135 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:35:27.081146 | orchestrator | 2025-09-27 00:35:27.081155 | orchestrator | TASK [Reboot system - do not wait for the reboot to complete] ****************** 2025-09-27 00:35:27.081165 | orchestrator | Saturday 27 September 2025 00:35:21 +0000 (0:00:00.102) 0:00:00.308 **** 2025-09-27 00:35:27.081175 | orchestrator | changed: [testbed-node-0] 2025-09-27 00:35:27.081185 | orchestrator | 2025-09-27 00:35:27.081194 | orchestrator | TASK [Reboot system - wait for the reboot to complete] ************************* 2025-09-27 00:35:27.081245 | orchestrator | Saturday 27 September 2025 00:35:22 +0000 (0:00:00.910) 0:00:01.218 **** 2025-09-27 00:35:27.081256 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:35:27.081266 | orchestrator | 2025-09-27 00:35:27.081275 | orchestrator | PLAY [Reboot systems] ********************************************************** 2025-09-27 00:35:27.081285 | orchestrator | 2025-09-27 00:35:27.081294 | orchestrator | TASK [Exit playbook, if user did not mean to reboot systems] ******************* 2025-09-27 00:35:27.081304 | orchestrator | Saturday 27 September 2025 00:35:22 +0000 (0:00:00.116) 0:00:01.334 **** 2025-09-27 00:35:27.081313 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:35:27.081323 | orchestrator | 2025-09-27 00:35:27.081332 | orchestrator | TASK [Reboot system - do not wait for the reboot to complete] ****************** 2025-09-27 00:35:27.081342 | orchestrator | Saturday 27 September 2025 00:35:22 +0000 (0:00:00.104) 0:00:01.439 **** 2025-09-27 00:35:27.081351 | orchestrator | changed: [testbed-node-1] 2025-09-27 00:35:27.081361 | orchestrator | 2025-09-27 00:35:27.081370 | orchestrator | TASK [Reboot system - wait for the reboot to complete] ************************* 2025-09-27 00:35:27.081395 | orchestrator | Saturday 27 September 2025 00:35:23 +0000 (0:00:00.639) 0:00:02.079 **** 2025-09-27 00:35:27.081405 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:35:27.081414 | orchestrator | 2025-09-27 00:35:27.081424 | 
orchestrator | PLAY [Reboot systems] ********************************************************** 2025-09-27 00:35:27.081433 | orchestrator | 2025-09-27 00:35:27.081443 | orchestrator | TASK [Exit playbook, if user did not mean to reboot systems] ******************* 2025-09-27 00:35:27.081452 | orchestrator | Saturday 27 September 2025 00:35:23 +0000 (0:00:00.108) 0:00:02.187 **** 2025-09-27 00:35:27.081462 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:35:27.081471 | orchestrator | 2025-09-27 00:35:27.081481 | orchestrator | TASK [Reboot system - do not wait for the reboot to complete] ****************** 2025-09-27 00:35:27.081490 | orchestrator | Saturday 27 September 2025 00:35:23 +0000 (0:00:00.207) 0:00:02.394 **** 2025-09-27 00:35:27.081500 | orchestrator | changed: [testbed-node-2] 2025-09-27 00:35:27.081511 | orchestrator | 2025-09-27 00:35:27.081523 | orchestrator | TASK [Reboot system - wait for the reboot to complete] ************************* 2025-09-27 00:35:27.081534 | orchestrator | Saturday 27 September 2025 00:35:24 +0000 (0:00:00.624) 0:00:03.019 **** 2025-09-27 00:35:27.081545 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:35:27.081556 | orchestrator | 2025-09-27 00:35:27.081567 | orchestrator | PLAY [Reboot systems] ********************************************************** 2025-09-27 00:35:27.081578 | orchestrator | 2025-09-27 00:35:27.081589 | orchestrator | TASK [Exit playbook, if user did not mean to reboot systems] ******************* 2025-09-27 00:35:27.081601 | orchestrator | Saturday 27 September 2025 00:35:24 +0000 (0:00:00.121) 0:00:03.141 **** 2025-09-27 00:35:27.081611 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:35:27.081622 | orchestrator | 2025-09-27 00:35:27.081633 | orchestrator | TASK [Reboot system - do not wait for the reboot to complete] ****************** 2025-09-27 00:35:27.081644 | orchestrator | Saturday 27 September 2025 00:35:24 +0000 (0:00:00.103) 0:00:03.244 **** 2025-09-27 00:35:27.081655 | orchestrator | changed: [testbed-node-3] 2025-09-27 00:35:27.081666 | orchestrator | 2025-09-27 00:35:27.081678 | orchestrator | TASK [Reboot system - wait for the reboot to complete] ************************* 2025-09-27 00:35:27.081689 | orchestrator | Saturday 27 September 2025 00:35:25 +0000 (0:00:00.641) 0:00:03.886 **** 2025-09-27 00:35:27.081710 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:35:27.081721 | orchestrator | 2025-09-27 00:35:27.081733 | orchestrator | PLAY [Reboot systems] ********************************************************** 2025-09-27 00:35:27.081743 | orchestrator | 2025-09-27 00:35:27.081754 | orchestrator | TASK [Exit playbook, if user did not mean to reboot systems] ******************* 2025-09-27 00:35:27.081765 | orchestrator | Saturday 27 September 2025 00:35:25 +0000 (0:00:00.117) 0:00:04.003 **** 2025-09-27 00:35:27.081776 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:35:27.081787 | orchestrator | 2025-09-27 00:35:27.081798 | orchestrator | TASK [Reboot system - do not wait for the reboot to complete] ****************** 2025-09-27 00:35:27.081809 | orchestrator | Saturday 27 September 2025 00:35:25 +0000 (0:00:00.105) 0:00:04.108 **** 2025-09-27 00:35:27.081820 | orchestrator | changed: [testbed-node-4] 2025-09-27 00:35:27.081831 | orchestrator | 2025-09-27 00:35:27.081842 | orchestrator | TASK [Reboot system - wait for the reboot to complete] ************************* 2025-09-27 00:35:27.081853 | orchestrator | Saturday 27 September 2025 00:35:25 +0000 
(0:00:00.649) 0:00:04.758 **** 2025-09-27 00:35:27.081864 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:35:27.081873 | orchestrator | 2025-09-27 00:35:27.081883 | orchestrator | PLAY [Reboot systems] ********************************************************** 2025-09-27 00:35:27.081892 | orchestrator | 2025-09-27 00:35:27.081902 | orchestrator | TASK [Exit playbook, if user did not mean to reboot systems] ******************* 2025-09-27 00:35:27.081911 | orchestrator | Saturday 27 September 2025 00:35:26 +0000 (0:00:00.106) 0:00:04.865 **** 2025-09-27 00:35:27.081920 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:35:27.081930 | orchestrator | 2025-09-27 00:35:27.081939 | orchestrator | TASK [Reboot system - do not wait for the reboot to complete] ****************** 2025-09-27 00:35:27.081949 | orchestrator | Saturday 27 September 2025 00:35:26 +0000 (0:00:00.100) 0:00:04.965 **** 2025-09-27 00:35:27.081959 | orchestrator | changed: [testbed-node-5] 2025-09-27 00:35:27.081968 | orchestrator | 2025-09-27 00:35:27.081977 | orchestrator | TASK [Reboot system - wait for the reboot to complete] ************************* 2025-09-27 00:35:27.081987 | orchestrator | Saturday 27 September 2025 00:35:26 +0000 (0:00:00.634) 0:00:05.600 **** 2025-09-27 00:35:27.082012 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:35:27.082078 | orchestrator | 2025-09-27 00:35:27.082088 | orchestrator | PLAY RECAP ********************************************************************* 2025-09-27 00:35:27.082099 | orchestrator | testbed-node-0 : ok=1  changed=1  unreachable=0 failed=0 skipped=2  rescued=0 ignored=0 2025-09-27 00:35:27.082110 | orchestrator | testbed-node-1 : ok=1  changed=1  unreachable=0 failed=0 skipped=2  rescued=0 ignored=0 2025-09-27 00:35:27.082120 | orchestrator | testbed-node-2 : ok=1  changed=1  unreachable=0 failed=0 skipped=2  rescued=0 ignored=0 2025-09-27 00:35:27.082130 | orchestrator | testbed-node-3 : ok=1  changed=1  unreachable=0 failed=0 skipped=2  rescued=0 ignored=0 2025-09-27 00:35:27.082140 | orchestrator | testbed-node-4 : ok=1  changed=1  unreachable=0 failed=0 skipped=2  rescued=0 ignored=0 2025-09-27 00:35:27.082149 | orchestrator | testbed-node-5 : ok=1  changed=1  unreachable=0 failed=0 skipped=2  rescued=0 ignored=0 2025-09-27 00:35:27.082159 | orchestrator | 2025-09-27 00:35:27.082169 | orchestrator | 2025-09-27 00:35:27.082178 | orchestrator | TASKS RECAP ******************************************************************** 2025-09-27 00:35:27.082188 | orchestrator | Saturday 27 September 2025 00:35:26 +0000 (0:00:00.035) 0:00:05.636 **** 2025-09-27 00:35:27.082198 | orchestrator | =============================================================================== 2025-09-27 00:35:27.082235 | orchestrator | Reboot system - do not wait for the reboot to complete ------------------ 4.10s 2025-09-27 00:35:27.082245 | orchestrator | Exit playbook, if user did not mean to reboot systems ------------------- 0.72s 2025-09-27 00:35:27.082262 | orchestrator | Reboot system - wait for the reboot to complete ------------------------- 0.61s 2025-09-27 00:35:27.342318 | orchestrator | + osism apply wait-for-connection -l testbed-nodes -e ireallymeanit=yes 2025-09-27 00:35:39.263619 | orchestrator | 2025-09-27 00:35:39 | INFO  | Task 8327b958-440a-4f1c-8a51-b100109119e2 (wait-for-connection) was prepared for execution. 
2025-09-27 00:35:39.263726 | orchestrator | 2025-09-27 00:35:39 | INFO  | It takes a moment until task 8327b958-440a-4f1c-8a51-b100109119e2 (wait-for-connection) has been started and output is visible here. 2025-09-27 00:35:55.137800 | orchestrator | 2025-09-27 00:35:55.137916 | orchestrator | PLAY [Wait until remote systems are reachable] ********************************* 2025-09-27 00:35:55.137933 | orchestrator | 2025-09-27 00:35:55.137946 | orchestrator | TASK [Wait until remote system is reachable] *********************************** 2025-09-27 00:35:55.137958 | orchestrator | Saturday 27 September 2025 00:35:43 +0000 (0:00:00.246) 0:00:00.246 **** 2025-09-27 00:35:55.137970 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:35:55.137982 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:35:55.137993 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:35:55.138003 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:35:55.138081 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:35:55.138094 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:35:55.138105 | orchestrator | 2025-09-27 00:35:55.138117 | orchestrator | PLAY RECAP ********************************************************************* 2025-09-27 00:35:55.138128 | orchestrator | testbed-node-0 : ok=1  changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 2025-09-27 00:35:55.138141 | orchestrator | testbed-node-1 : ok=1  changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 2025-09-27 00:35:55.138153 | orchestrator | testbed-node-2 : ok=1  changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 2025-09-27 00:35:55.138163 | orchestrator | testbed-node-3 : ok=1  changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 2025-09-27 00:35:55.138174 | orchestrator | testbed-node-4 : ok=1  changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 2025-09-27 00:35:55.138185 | orchestrator | testbed-node-5 : ok=1  changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 2025-09-27 00:35:55.138196 | orchestrator | 2025-09-27 00:35:55.138249 | orchestrator | 2025-09-27 00:35:55.138261 | orchestrator | TASKS RECAP ******************************************************************** 2025-09-27 00:35:55.138272 | orchestrator | Saturday 27 September 2025 00:35:54 +0000 (0:00:11.574) 0:00:11.820 **** 2025-09-27 00:35:55.138283 | orchestrator | =============================================================================== 2025-09-27 00:35:55.138294 | orchestrator | Wait until remote system is reachable ---------------------------------- 11.57s 2025-09-27 00:35:55.391654 | orchestrator | + osism apply hddtemp 2025-09-27 00:36:07.455693 | orchestrator | 2025-09-27 00:36:07 | INFO  | Task b7781b1f-3603-49ed-8780-b58199c6c41d (hddtemp) was prepared for execution. 2025-09-27 00:36:07.455810 | orchestrator | 2025-09-27 00:36:07 | INFO  | It takes a moment until task b7781b1f-3603-49ed-8780-b58199c6c41d (hddtemp) has been started and output is visible here. 
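
The hddtemp play whose output follows removes the legacy hddtemp package and instead enables the in-kernel drivetemp hwmon driver, which exposes SATA drive temperatures under /sys/class/hwmon (readable with lm-sensors, for example). Persistently loading the module usually comes down to a one-line modules-load.d file like the sketch below; the exact file name used by the role is an assumption:

# /etc/modules-load.d/drivetemp.conf  (assumed path and name)
# Loads the drivetemp hwmon driver at boot; running `modprobe drivetemp`
# activates it immediately for the current boot.
drivetemp
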
2025-09-27 00:36:35.900689 | orchestrator | 2025-09-27 00:36:35.900832 | orchestrator | PLAY [Apply role hddtemp] ****************************************************** 2025-09-27 00:36:35.900851 | orchestrator | 2025-09-27 00:36:35.900863 | orchestrator | TASK [osism.services.hddtemp : Gather variables for each operating system] ***** 2025-09-27 00:36:35.900875 | orchestrator | Saturday 27 September 2025 00:36:11 +0000 (0:00:00.274) 0:00:00.274 **** 2025-09-27 00:36:35.900923 | orchestrator | ok: [testbed-manager] 2025-09-27 00:36:35.900938 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:36:35.900950 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:36:35.900987 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:36:35.900998 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:36:35.901009 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:36:35.901020 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:36:35.901031 | orchestrator | 2025-09-27 00:36:35.901043 | orchestrator | TASK [osism.services.hddtemp : Include distribution specific install tasks] **** 2025-09-27 00:36:35.901054 | orchestrator | Saturday 27 September 2025 00:36:12 +0000 (0:00:00.688) 0:00:00.962 **** 2025-09-27 00:36:35.901066 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/hddtemp/tasks/install-Debian-family.yml for testbed-manager, testbed-node-0, testbed-node-1, testbed-node-2, testbed-node-3, testbed-node-4, testbed-node-5 2025-09-27 00:36:35.901079 | orchestrator | 2025-09-27 00:36:35.901091 | orchestrator | TASK [osism.services.hddtemp : Remove hddtemp package] ************************* 2025-09-27 00:36:35.901102 | orchestrator | Saturday 27 September 2025 00:36:13 +0000 (0:00:01.171) 0:00:02.134 **** 2025-09-27 00:36:35.901112 | orchestrator | ok: [testbed-manager] 2025-09-27 00:36:35.901123 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:36:35.901134 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:36:35.901145 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:36:35.901156 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:36:35.901167 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:36:35.901177 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:36:35.901188 | orchestrator | 2025-09-27 00:36:35.901199 | orchestrator | TASK [osism.services.hddtemp : Enable Kernel Module drivetemp] ***************** 2025-09-27 00:36:35.901261 | orchestrator | Saturday 27 September 2025 00:36:15 +0000 (0:00:02.094) 0:00:04.229 **** 2025-09-27 00:36:35.901275 | orchestrator | changed: [testbed-manager] 2025-09-27 00:36:35.901289 | orchestrator | changed: [testbed-node-0] 2025-09-27 00:36:35.901301 | orchestrator | changed: [testbed-node-1] 2025-09-27 00:36:35.901314 | orchestrator | changed: [testbed-node-2] 2025-09-27 00:36:35.901327 | orchestrator | changed: [testbed-node-3] 2025-09-27 00:36:35.901339 | orchestrator | changed: [testbed-node-4] 2025-09-27 00:36:35.901351 | orchestrator | changed: [testbed-node-5] 2025-09-27 00:36:35.901364 | orchestrator | 2025-09-27 00:36:35.901376 | orchestrator | TASK [osism.services.hddtemp : Check if drivetemp module is available] ********* 2025-09-27 00:36:35.901389 | orchestrator | Saturday 27 September 2025 00:36:16 +0000 (0:00:01.130) 0:00:05.360 **** 2025-09-27 00:36:35.901401 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:36:35.901414 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:36:35.901427 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:36:35.901439 | orchestrator | ok: [testbed-node-3] 2025-09-27 
00:36:35.901451 | orchestrator | ok: [testbed-manager] 2025-09-27 00:36:35.901463 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:36:35.901475 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:36:35.901487 | orchestrator | 2025-09-27 00:36:35.901499 | orchestrator | TASK [osism.services.hddtemp : Load Kernel Module drivetemp] ******************* 2025-09-27 00:36:35.901511 | orchestrator | Saturday 27 September 2025 00:36:17 +0000 (0:00:01.151) 0:00:06.511 **** 2025-09-27 00:36:35.901524 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:36:35.901536 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:36:35.901549 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:36:35.901613 | orchestrator | changed: [testbed-manager] 2025-09-27 00:36:35.901627 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:36:35.901638 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:36:35.901649 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:36:35.901660 | orchestrator | 2025-09-27 00:36:35.901671 | orchestrator | TASK [osism.services.hddtemp : Install lm-sensors] ***************************** 2025-09-27 00:36:35.901682 | orchestrator | Saturday 27 September 2025 00:36:18 +0000 (0:00:00.805) 0:00:07.317 **** 2025-09-27 00:36:35.901692 | orchestrator | changed: [testbed-manager] 2025-09-27 00:36:35.901703 | orchestrator | changed: [testbed-node-2] 2025-09-27 00:36:35.901714 | orchestrator | changed: [testbed-node-1] 2025-09-27 00:36:35.901725 | orchestrator | changed: [testbed-node-0] 2025-09-27 00:36:35.901735 | orchestrator | changed: [testbed-node-5] 2025-09-27 00:36:35.901758 | orchestrator | changed: [testbed-node-3] 2025-09-27 00:36:35.901769 | orchestrator | changed: [testbed-node-4] 2025-09-27 00:36:35.901780 | orchestrator | 2025-09-27 00:36:35.901791 | orchestrator | TASK [osism.services.hddtemp : Include distribution specific service tasks] **** 2025-09-27 00:36:35.901802 | orchestrator | Saturday 27 September 2025 00:36:32 +0000 (0:00:13.699) 0:00:21.017 **** 2025-09-27 00:36:35.901813 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/hddtemp/tasks/service-Debian-family.yml for testbed-manager, testbed-node-0, testbed-node-1, testbed-node-2, testbed-node-3, testbed-node-4, testbed-node-5 2025-09-27 00:36:35.901824 | orchestrator | 2025-09-27 00:36:35.901835 | orchestrator | TASK [osism.services.hddtemp : Manage lm-sensors service] ********************** 2025-09-27 00:36:35.901846 | orchestrator | Saturday 27 September 2025 00:36:33 +0000 (0:00:01.417) 0:00:22.434 **** 2025-09-27 00:36:35.901857 | orchestrator | changed: [testbed-manager] 2025-09-27 00:36:35.901868 | orchestrator | changed: [testbed-node-1] 2025-09-27 00:36:35.901878 | orchestrator | changed: [testbed-node-0] 2025-09-27 00:36:35.901889 | orchestrator | changed: [testbed-node-2] 2025-09-27 00:36:35.901900 | orchestrator | changed: [testbed-node-3] 2025-09-27 00:36:35.901911 | orchestrator | changed: [testbed-node-4] 2025-09-27 00:36:35.901921 | orchestrator | changed: [testbed-node-5] 2025-09-27 00:36:35.901932 | orchestrator | 2025-09-27 00:36:35.901943 | orchestrator | PLAY RECAP ********************************************************************* 2025-09-27 00:36:35.901954 | orchestrator | testbed-manager : ok=9  changed=4  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 2025-09-27 00:36:35.901986 | orchestrator | testbed-node-0 : ok=8  changed=3  unreachable=0 failed=0 skipped=1  rescued=0 ignored=0 2025-09-27 00:36:35.901998 | 
orchestrator | testbed-node-1 : ok=8  changed=3  unreachable=0 failed=0 skipped=1  rescued=0 ignored=0 2025-09-27 00:36:35.902010 | orchestrator | testbed-node-2 : ok=8  changed=3  unreachable=0 failed=0 skipped=1  rescued=0 ignored=0 2025-09-27 00:36:35.902082 | orchestrator | testbed-node-3 : ok=8  changed=3  unreachable=0 failed=0 skipped=1  rescued=0 ignored=0 2025-09-27 00:36:35.902094 | orchestrator | testbed-node-4 : ok=8  changed=3  unreachable=0 failed=0 skipped=1  rescued=0 ignored=0 2025-09-27 00:36:35.902105 | orchestrator | testbed-node-5 : ok=8  changed=3  unreachable=0 failed=0 skipped=1  rescued=0 ignored=0 2025-09-27 00:36:35.902116 | orchestrator | 2025-09-27 00:36:35.902127 | orchestrator | 2025-09-27 00:36:35.902139 | orchestrator | TASKS RECAP ******************************************************************** 2025-09-27 00:36:35.902150 | orchestrator | Saturday 27 September 2025 00:36:35 +0000 (0:00:01.990) 0:00:24.424 **** 2025-09-27 00:36:35.902161 | orchestrator | =============================================================================== 2025-09-27 00:36:35.902172 | orchestrator | osism.services.hddtemp : Install lm-sensors ---------------------------- 13.70s 2025-09-27 00:36:35.902182 | orchestrator | osism.services.hddtemp : Remove hddtemp package ------------------------- 2.09s 2025-09-27 00:36:35.902194 | orchestrator | osism.services.hddtemp : Manage lm-sensors service ---------------------- 1.99s 2025-09-27 00:36:35.902229 | orchestrator | osism.services.hddtemp : Include distribution specific service tasks ---- 1.42s 2025-09-27 00:36:35.902242 | orchestrator | osism.services.hddtemp : Include distribution specific install tasks ---- 1.17s 2025-09-27 00:36:35.902253 | orchestrator | osism.services.hddtemp : Check if drivetemp module is available --------- 1.15s 2025-09-27 00:36:35.902264 | orchestrator | osism.services.hddtemp : Enable Kernel Module drivetemp ----------------- 1.13s 2025-09-27 00:36:35.902275 | orchestrator | osism.services.hddtemp : Load Kernel Module drivetemp ------------------- 0.81s 2025-09-27 00:36:35.902295 | orchestrator | osism.services.hddtemp : Gather variables for each operating system ----- 0.69s 2025-09-27 00:36:36.151416 | orchestrator | ++ semver latest 7.1.1 2025-09-27 00:36:36.212700 | orchestrator | + [[ -1 -ge 0 ]] 2025-09-27 00:36:36.212730 | orchestrator | + [[ latest == \l\a\t\e\s\t ]] 2025-09-27 00:36:36.212743 | orchestrator | + sudo systemctl restart manager.service 2025-09-27 00:36:50.164957 | orchestrator | + [[ ceph-ansible == \c\e\p\h\-\a\n\s\i\b\l\e ]] 2025-09-27 00:36:50.165085 | orchestrator | + wait_for_container_healthy 60 ceph-ansible 2025-09-27 00:36:50.165104 | orchestrator | + local max_attempts=60 2025-09-27 00:36:50.165116 | orchestrator | + local name=ceph-ansible 2025-09-27 00:36:50.165127 | orchestrator | + local attempt_num=1 2025-09-27 00:36:50.165139 | orchestrator | ++ /usr/bin/docker inspect -f '{{.State.Health.Status}}' ceph-ansible 2025-09-27 00:36:50.199994 | orchestrator | + [[ unhealthy == \h\e\a\l\t\h\y ]] 2025-09-27 00:36:50.200056 | orchestrator | + (( attempt_num++ == max_attempts )) 2025-09-27 00:36:50.200069 | orchestrator | + sleep 5 2025-09-27 00:36:55.203515 | orchestrator | ++ /usr/bin/docker inspect -f '{{.State.Health.Status}}' ceph-ansible 2025-09-27 00:36:55.231514 | orchestrator | + [[ unhealthy == \h\e\a\l\t\h\y ]] 2025-09-27 00:36:55.231602 | orchestrator | + (( attempt_num++ == max_attempts )) 2025-09-27 00:36:55.231617 | orchestrator | + sleep 5 2025-09-27 
00:37:00.234750 | orchestrator | ++ /usr/bin/docker inspect -f '{{.State.Health.Status}}' ceph-ansible 2025-09-27 00:37:00.277933 | orchestrator | + [[ unhealthy == \h\e\a\l\t\h\y ]] 2025-09-27 00:37:00.277967 | orchestrator | + (( attempt_num++ == max_attempts )) 2025-09-27 00:37:00.277981 | orchestrator | + sleep 5 2025-09-27 00:37:05.283425 | orchestrator | ++ /usr/bin/docker inspect -f '{{.State.Health.Status}}' ceph-ansible 2025-09-27 00:37:05.323399 | orchestrator | + [[ unhealthy == \h\e\a\l\t\h\y ]] 2025-09-27 00:37:05.323464 | orchestrator | + (( attempt_num++ == max_attempts )) 2025-09-27 00:37:05.323474 | orchestrator | + sleep 5 2025-09-27 00:37:10.329106 | orchestrator | ++ /usr/bin/docker inspect -f '{{.State.Health.Status}}' ceph-ansible 2025-09-27 00:37:10.370290 | orchestrator | + [[ unhealthy == \h\e\a\l\t\h\y ]] 2025-09-27 00:37:10.370341 | orchestrator | + (( attempt_num++ == max_attempts )) 2025-09-27 00:37:10.370354 | orchestrator | + sleep 5 2025-09-27 00:37:15.375817 | orchestrator | ++ /usr/bin/docker inspect -f '{{.State.Health.Status}}' ceph-ansible 2025-09-27 00:37:15.411938 | orchestrator | + [[ unhealthy == \h\e\a\l\t\h\y ]] 2025-09-27 00:37:15.412006 | orchestrator | + (( attempt_num++ == max_attempts )) 2025-09-27 00:37:15.412018 | orchestrator | + sleep 5 2025-09-27 00:37:20.418011 | orchestrator | ++ /usr/bin/docker inspect -f '{{.State.Health.Status}}' ceph-ansible 2025-09-27 00:37:20.461020 | orchestrator | + [[ unhealthy == \h\e\a\l\t\h\y ]] 2025-09-27 00:37:20.461059 | orchestrator | + (( attempt_num++ == max_attempts )) 2025-09-27 00:37:20.461070 | orchestrator | + sleep 5 2025-09-27 00:37:25.467447 | orchestrator | ++ /usr/bin/docker inspect -f '{{.State.Health.Status}}' ceph-ansible 2025-09-27 00:37:25.498160 | orchestrator | + [[ starting == \h\e\a\l\t\h\y ]] 2025-09-27 00:37:25.498255 | orchestrator | + (( attempt_num++ == max_attempts )) 2025-09-27 00:37:25.498270 | orchestrator | + sleep 5 2025-09-27 00:37:30.503307 | orchestrator | ++ /usr/bin/docker inspect -f '{{.State.Health.Status}}' ceph-ansible 2025-09-27 00:37:30.525180 | orchestrator | + [[ starting == \h\e\a\l\t\h\y ]] 2025-09-27 00:37:30.525276 | orchestrator | + (( attempt_num++ == max_attempts )) 2025-09-27 00:37:30.525292 | orchestrator | + sleep 5 2025-09-27 00:37:35.528501 | orchestrator | ++ /usr/bin/docker inspect -f '{{.State.Health.Status}}' ceph-ansible 2025-09-27 00:37:35.565459 | orchestrator | + [[ starting == \h\e\a\l\t\h\y ]] 2025-09-27 00:37:35.565542 | orchestrator | + (( attempt_num++ == max_attempts )) 2025-09-27 00:37:35.565557 | orchestrator | + sleep 5 2025-09-27 00:37:40.571289 | orchestrator | ++ /usr/bin/docker inspect -f '{{.State.Health.Status}}' ceph-ansible 2025-09-27 00:37:40.603316 | orchestrator | + [[ starting == \h\e\a\l\t\h\y ]] 2025-09-27 00:37:40.603375 | orchestrator | + (( attempt_num++ == max_attempts )) 2025-09-27 00:37:40.603390 | orchestrator | + sleep 5 2025-09-27 00:37:45.607105 | orchestrator | ++ /usr/bin/docker inspect -f '{{.State.Health.Status}}' ceph-ansible 2025-09-27 00:37:45.647007 | orchestrator | + [[ starting == \h\e\a\l\t\h\y ]] 2025-09-27 00:37:45.647080 | orchestrator | + (( attempt_num++ == max_attempts )) 2025-09-27 00:37:45.647093 | orchestrator | + sleep 5 2025-09-27 00:37:50.651995 | orchestrator | ++ /usr/bin/docker inspect -f '{{.State.Health.Status}}' ceph-ansible 2025-09-27 00:37:50.691722 | orchestrator | + [[ starting == \h\e\a\l\t\h\y ]] 2025-09-27 00:37:50.691832 | orchestrator | + (( attempt_num++ == 
max_attempts )) 2025-09-27 00:37:50.691844 | orchestrator | + sleep 5 2025-09-27 00:37:55.696778 | orchestrator | ++ /usr/bin/docker inspect -f '{{.State.Health.Status}}' ceph-ansible 2025-09-27 00:37:55.732439 | orchestrator | + [[ healthy == \h\e\a\l\t\h\y ]] 2025-09-27 00:37:55.732519 | orchestrator | + wait_for_container_healthy 60 kolla-ansible 2025-09-27 00:37:55.732533 | orchestrator | + local max_attempts=60 2025-09-27 00:37:55.732545 | orchestrator | + local name=kolla-ansible 2025-09-27 00:37:55.732555 | orchestrator | + local attempt_num=1 2025-09-27 00:37:55.733704 | orchestrator | ++ /usr/bin/docker inspect -f '{{.State.Health.Status}}' kolla-ansible 2025-09-27 00:37:55.763203 | orchestrator | + [[ healthy == \h\e\a\l\t\h\y ]] 2025-09-27 00:37:55.763294 | orchestrator | + wait_for_container_healthy 60 osism-ansible 2025-09-27 00:37:55.763309 | orchestrator | + local max_attempts=60 2025-09-27 00:37:55.763321 | orchestrator | + local name=osism-ansible 2025-09-27 00:37:55.763332 | orchestrator | + local attempt_num=1 2025-09-27 00:37:55.763615 | orchestrator | ++ /usr/bin/docker inspect -f '{{.State.Health.Status}}' osism-ansible 2025-09-27 00:37:55.808030 | orchestrator | + [[ healthy == \h\e\a\l\t\h\y ]] 2025-09-27 00:37:55.808104 | orchestrator | + [[ true == \t\r\u\e ]] 2025-09-27 00:37:55.808119 | orchestrator | + sh -c /opt/configuration/scripts/disable-ara.sh 2025-09-27 00:37:55.992093 | orchestrator | ARA in ceph-ansible already disabled. 2025-09-27 00:37:56.160562 | orchestrator | ARA in kolla-ansible already disabled. 2025-09-27 00:37:56.320669 | orchestrator | ARA in osism-ansible already disabled. 2025-09-27 00:37:56.499293 | orchestrator | ARA in osism-kubernetes already disabled. 2025-09-27 00:37:56.499532 | orchestrator | + osism apply gather-facts 2025-09-27 00:38:08.734365 | orchestrator | 2025-09-27 00:38:08 | INFO  | Task 1c2568aa-1413-49bc-8194-4dd77ec6222e (gather-facts) was prepared for execution. 2025-09-27 00:38:08.734472 | orchestrator | 2025-09-27 00:38:08 | INFO  | It takes a moment until task 1c2568aa-1413-49bc-8194-4dd77ec6222e (gather-facts) has been started and output is visible here. 
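The trace above polls `docker inspect -f '{{.State.Health.Status}}'` every five seconds until the ceph-ansible, kolla-ansible and osism-ansible containers report healthy after the manager.service restart. A minimal sketch of that polling function, reconstructed from the traced commands (the authoritative implementation lives in the testbed scripts):

    wait_for_container_healthy() {
        local max_attempts=$1
        local name=$2
        local attempt_num=1
        until [[ "$(/usr/bin/docker inspect -f '{{.State.Health.Status}}' "$name")" == "healthy" ]]; do
            # Give up once the maximum number of attempts has been reached.
            if (( attempt_num++ == max_attempts )); then
                echo "container $name did not become healthy" >&2
                return 1
            fi
            sleep 5
        done
    }

    wait_for_container_healthy 60 ceph-ansible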
2025-09-27 00:38:21.672168 | orchestrator | 2025-09-27 00:38:21.672328 | orchestrator | PLAY [Gather facts for all hosts] ********************************************** 2025-09-27 00:38:21.672345 | orchestrator | 2025-09-27 00:38:21.672357 | orchestrator | TASK [Gathers facts about hosts] *********************************************** 2025-09-27 00:38:21.672390 | orchestrator | Saturday 27 September 2025 00:38:12 +0000 (0:00:00.200) 0:00:00.200 **** 2025-09-27 00:38:21.672402 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:38:21.672414 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:38:21.672425 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:38:21.672436 | orchestrator | ok: [testbed-manager] 2025-09-27 00:38:21.672447 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:38:21.672458 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:38:21.672471 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:38:21.672483 | orchestrator | 2025-09-27 00:38:21.672495 | orchestrator | PLAY [Gather facts for all hosts if using --limit] ***************************** 2025-09-27 00:38:21.672506 | orchestrator | 2025-09-27 00:38:21.672519 | orchestrator | TASK [Gather facts for all hosts] ********************************************** 2025-09-27 00:38:21.672530 | orchestrator | Saturday 27 September 2025 00:38:20 +0000 (0:00:08.528) 0:00:08.728 **** 2025-09-27 00:38:21.672542 | orchestrator | skipping: [testbed-manager] 2025-09-27 00:38:21.672556 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:38:21.672567 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:38:21.672579 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:38:21.672591 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:38:21.672603 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:38:21.672615 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:38:21.672627 | orchestrator | 2025-09-27 00:38:21.672638 | orchestrator | PLAY RECAP ********************************************************************* 2025-09-27 00:38:21.672651 | orchestrator | testbed-manager : ok=1  changed=0 unreachable=0 failed=0 skipped=1  rescued=0 ignored=0 2025-09-27 00:38:21.672664 | orchestrator | testbed-node-0 : ok=1  changed=0 unreachable=0 failed=0 skipped=1  rescued=0 ignored=0 2025-09-27 00:38:21.672701 | orchestrator | testbed-node-1 : ok=1  changed=0 unreachable=0 failed=0 skipped=1  rescued=0 ignored=0 2025-09-27 00:38:21.672714 | orchestrator | testbed-node-2 : ok=1  changed=0 unreachable=0 failed=0 skipped=1  rescued=0 ignored=0 2025-09-27 00:38:21.672726 | orchestrator | testbed-node-3 : ok=1  changed=0 unreachable=0 failed=0 skipped=1  rescued=0 ignored=0 2025-09-27 00:38:21.672738 | orchestrator | testbed-node-4 : ok=1  changed=0 unreachable=0 failed=0 skipped=1  rescued=0 ignored=0 2025-09-27 00:38:21.672751 | orchestrator | testbed-node-5 : ok=1  changed=0 unreachable=0 failed=0 skipped=1  rescued=0 ignored=0 2025-09-27 00:38:21.672761 | orchestrator | 2025-09-27 00:38:21.672769 | orchestrator | 2025-09-27 00:38:21.672776 | orchestrator | TASKS RECAP ******************************************************************** 2025-09-27 00:38:21.672784 | orchestrator | Saturday 27 September 2025 00:38:21 +0000 (0:00:00.448) 0:00:09.177 **** 2025-09-27 00:38:21.672792 | orchestrator | =============================================================================== 2025-09-27 00:38:21.672799 | orchestrator | Gathers facts about hosts ----------------------------------------------- 8.53s 2025-09-27 
00:38:21.672808 | orchestrator | Gather facts for all hosts ---------------------------------------------- 0.45s 2025-09-27 00:38:21.847773 | orchestrator | + sudo ln -sf /opt/configuration/scripts/deploy/001-helpers.sh /usr/local/bin/deploy-helper 2025-09-27 00:38:21.856615 | orchestrator | + sudo ln -sf /opt/configuration/scripts/deploy/500-kubernetes.sh /usr/local/bin/deploy-kubernetes 2025-09-27 00:38:21.866549 | orchestrator | + sudo ln -sf /opt/configuration/scripts/deploy/510-clusterapi.sh /usr/local/bin/deploy-kubernetes-clusterapi 2025-09-27 00:38:21.875890 | orchestrator | + sudo ln -sf /opt/configuration/scripts/deploy/100-ceph-with-ansible.sh /usr/local/bin/deploy-ceph-with-ansible 2025-09-27 00:38:21.892875 | orchestrator | + sudo ln -sf /opt/configuration/scripts/deploy/100-ceph-with-rook.sh /usr/local/bin/deploy-ceph-with-rook 2025-09-27 00:38:21.903355 | orchestrator | + sudo ln -sf /opt/configuration/scripts/deploy/200-infrastructure.sh /usr/local/bin/deploy-infrastructure 2025-09-27 00:38:21.912531 | orchestrator | + sudo ln -sf /opt/configuration/scripts/deploy/300-openstack.sh /usr/local/bin/deploy-openstack 2025-09-27 00:38:21.923007 | orchestrator | + sudo ln -sf /opt/configuration/scripts/deploy/400-monitoring.sh /usr/local/bin/deploy-monitoring 2025-09-27 00:38:21.938356 | orchestrator | + sudo ln -sf /opt/configuration/scripts/upgrade/500-kubernetes.sh /usr/local/bin/upgrade-kubernetes 2025-09-27 00:38:21.950143 | orchestrator | + sudo ln -sf /opt/configuration/scripts/upgrade/510-clusterapi.sh /usr/local/bin/upgrade-kubernetes-clusterapi 2025-09-27 00:38:21.960069 | orchestrator | + sudo ln -sf /opt/configuration/scripts/upgrade/100-ceph-with-ansible.sh /usr/local/bin/upgrade-ceph-with-ansible 2025-09-27 00:38:21.975781 | orchestrator | + sudo ln -sf /opt/configuration/scripts/upgrade/100-ceph-with-rook.sh /usr/local/bin/upgrade-ceph-with-rook 2025-09-27 00:38:21.988201 | orchestrator | + sudo ln -sf /opt/configuration/scripts/upgrade/200-infrastructure.sh /usr/local/bin/upgrade-infrastructure 2025-09-27 00:38:22.002179 | orchestrator | + sudo ln -sf /opt/configuration/scripts/upgrade/300-openstack.sh /usr/local/bin/upgrade-openstack 2025-09-27 00:38:22.017846 | orchestrator | + sudo ln -sf /opt/configuration/scripts/upgrade/400-monitoring.sh /usr/local/bin/upgrade-monitoring 2025-09-27 00:38:22.035750 | orchestrator | + sudo ln -sf /opt/configuration/scripts/bootstrap/300-openstack.sh /usr/local/bin/bootstrap-openstack 2025-09-27 00:38:22.048389 | orchestrator | + sudo ln -sf /opt/configuration/scripts/bootstrap/301-openstack-octavia-amhpora-image.sh /usr/local/bin/bootstrap-octavia 2025-09-27 00:38:22.062912 | orchestrator | + sudo ln -sf /opt/configuration/scripts/bootstrap/302-openstack-k8s-clusterapi-images.sh /usr/local/bin/bootstrap-clusterapi 2025-09-27 00:38:22.072054 | orchestrator | + sudo ln -sf /opt/configuration/scripts/disable-local-registry.sh /usr/local/bin/disable-local-registry 2025-09-27 00:38:22.081470 | orchestrator | + sudo ln -sf /opt/configuration/scripts/pull-images.sh /usr/local/bin/pull-images 2025-09-27 00:38:22.090660 | orchestrator | + [[ false == \t\r\u\e ]] 2025-09-27 00:38:22.314919 | orchestrator | ok: Runtime: 0:22:54.977306 2025-09-27 00:38:22.430413 | 2025-09-27 00:38:22.430563 | TASK [Deploy services] 2025-09-27 00:38:22.963022 | orchestrator | skipping: Conditional result was False 2025-09-27 00:38:22.977036 | 2025-09-27 00:38:22.977198 | TASK [Deploy in a nutshell] 2025-09-27 00:38:23.660877 | orchestrator | + set -e 
2025-09-27 00:38:23.662171 | orchestrator | 2025-09-27 00:38:23.662198 | orchestrator | # PULL IMAGES 2025-09-27 00:38:23.662232 | orchestrator | 2025-09-27 00:38:23.662241 | orchestrator | + source /opt/configuration/scripts/include.sh 2025-09-27 00:38:23.662251 | orchestrator | ++ export INTERACTIVE=false 2025-09-27 00:38:23.662258 | orchestrator | ++ INTERACTIVE=false 2025-09-27 00:38:23.662279 | orchestrator | ++ export OSISM_APPLY_RETRY=1 2025-09-27 00:38:23.662288 | orchestrator | ++ OSISM_APPLY_RETRY=1 2025-09-27 00:38:23.662294 | orchestrator | + source /opt/manager-vars.sh 2025-09-27 00:38:23.662299 | orchestrator | ++ export NUMBER_OF_NODES=6 2025-09-27 00:38:23.662307 | orchestrator | ++ NUMBER_OF_NODES=6 2025-09-27 00:38:23.662311 | orchestrator | ++ export CEPH_VERSION=reef 2025-09-27 00:38:23.662318 | orchestrator | ++ CEPH_VERSION=reef 2025-09-27 00:38:23.662322 | orchestrator | ++ export CONFIGURATION_VERSION=main 2025-09-27 00:38:23.662329 | orchestrator | ++ CONFIGURATION_VERSION=main 2025-09-27 00:38:23.662333 | orchestrator | ++ export MANAGER_VERSION=latest 2025-09-27 00:38:23.662339 | orchestrator | ++ MANAGER_VERSION=latest 2025-09-27 00:38:23.662343 | orchestrator | ++ export OPENSTACK_VERSION=2024.2 2025-09-27 00:38:23.662348 | orchestrator | ++ OPENSTACK_VERSION=2024.2 2025-09-27 00:38:23.662352 | orchestrator | ++ export ARA=false 2025-09-27 00:38:23.662355 | orchestrator | ++ ARA=false 2025-09-27 00:38:23.662359 | orchestrator | ++ export DEPLOY_MODE=manager 2025-09-27 00:38:23.662363 | orchestrator | ++ DEPLOY_MODE=manager 2025-09-27 00:38:23.662367 | orchestrator | ++ export TEMPEST=true 2025-09-27 00:38:23.662371 | orchestrator | ++ TEMPEST=true 2025-09-27 00:38:23.662375 | orchestrator | ++ export IS_ZUUL=true 2025-09-27 00:38:23.662378 | orchestrator | ++ IS_ZUUL=true 2025-09-27 00:38:23.662382 | orchestrator | ++ export MANAGER_PUBLIC_IP_ADDRESS=81.163.193.20 2025-09-27 00:38:23.662386 | orchestrator | ++ MANAGER_PUBLIC_IP_ADDRESS=81.163.193.20 2025-09-27 00:38:23.662390 | orchestrator | ++ export EXTERNAL_API=false 2025-09-27 00:38:23.662393 | orchestrator | ++ EXTERNAL_API=false 2025-09-27 00:38:23.662397 | orchestrator | ++ export IMAGE_USER=ubuntu 2025-09-27 00:38:23.662401 | orchestrator | ++ IMAGE_USER=ubuntu 2025-09-27 00:38:23.662405 | orchestrator | ++ export IMAGE_NODE_USER=ubuntu 2025-09-27 00:38:23.662409 | orchestrator | ++ IMAGE_NODE_USER=ubuntu 2025-09-27 00:38:23.662413 | orchestrator | ++ export CEPH_STACK=ceph-ansible 2025-09-27 00:38:23.662416 | orchestrator | ++ CEPH_STACK=ceph-ansible 2025-09-27 00:38:23.662420 | orchestrator | + echo 2025-09-27 00:38:23.662424 | orchestrator | + echo '# PULL IMAGES' 2025-09-27 00:38:23.662428 | orchestrator | + echo 2025-09-27 00:38:23.662543 | orchestrator | ++ semver latest 7.0.0 2025-09-27 00:38:23.715894 | orchestrator | + [[ -1 -ge 0 ]] 2025-09-27 00:38:23.715927 | orchestrator | + [[ latest == \l\a\t\e\s\t ]] 2025-09-27 00:38:23.715933 | orchestrator | + osism apply --no-wait -r 2 -e custom pull-images 2025-09-27 00:38:25.373888 | orchestrator | 2025-09-27 00:38:25 | INFO  | Trying to run play pull-images in environment custom 2025-09-27 00:38:35.515979 | orchestrator | 2025-09-27 00:38:35 | INFO  | Task 08016374-90b4-4aed-b7fd-4017927ce5b9 (pull-images) was prepared for execution. 2025-09-27 00:38:35.516073 | orchestrator | 2025-09-27 00:38:35 | INFO  | Task 08016374-90b4-4aed-b7fd-4017927ce5b9 is running in background. No more output. Check ARA for logs. 
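Before pulling images, the nutshell script gates the command on the manager version: the trace shows `semver latest 7.0.0` returning -1 followed by a check that the version string equals "latest" before `osism apply --no-wait -r 2 -e custom pull-images` is run. A hedged sketch of that gate, assuming the semver helper prints a -1/0/1 comparison result as the trace suggests:

    # Run the pull in the custom environment on manager >= 7.0.0 or on the rolling "latest" tag.
    # The exact semantics of the semver helper are inferred from the trace, not verified.
    if [[ "$(semver "$MANAGER_VERSION" 7.0.0)" -ge 0 || "$MANAGER_VERSION" == "latest" ]]; then
        osism apply --no-wait -r 2 -e custom pull-images
    fi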
2025-09-27 00:38:37.716976 | orchestrator | 2025-09-27 00:38:37 | INFO  | Trying to run play wipe-partitions in environment custom 2025-09-27 00:38:47.911564 | orchestrator | 2025-09-27 00:38:47 | INFO  | Task cd58c15c-0812-463f-85ac-e9d4ce60a2cc (wipe-partitions) was prepared for execution. 2025-09-27 00:38:47.911700 | orchestrator | 2025-09-27 00:38:47 | INFO  | It takes a moment until task cd58c15c-0812-463f-85ac-e9d4ce60a2cc (wipe-partitions) has been started and output is visible here. 2025-09-27 00:38:59.742431 | orchestrator | 2025-09-27 00:38:59.742548 | orchestrator | PLAY [Wipe partitions] ********************************************************* 2025-09-27 00:38:59.742565 | orchestrator | 2025-09-27 00:38:59.742577 | orchestrator | TASK [Find all logical devices owned by UID 167] ******************************* 2025-09-27 00:38:59.742594 | orchestrator | Saturday 27 September 2025 00:38:51 +0000 (0:00:00.132) 0:00:00.132 **** 2025-09-27 00:38:59.742605 | orchestrator | changed: [testbed-node-3] 2025-09-27 00:38:59.742617 | orchestrator | changed: [testbed-node-4] 2025-09-27 00:38:59.742629 | orchestrator | changed: [testbed-node-5] 2025-09-27 00:38:59.742640 | orchestrator | 2025-09-27 00:38:59.742651 | orchestrator | TASK [Remove all rook related logical devices] ********************************* 2025-09-27 00:38:59.742686 | orchestrator | Saturday 27 September 2025 00:38:51 +0000 (0:00:00.570) 0:00:00.702 **** 2025-09-27 00:38:59.742698 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:38:59.742708 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:38:59.742724 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:38:59.742735 | orchestrator | 2025-09-27 00:38:59.742747 | orchestrator | TASK [Find all logical devices with prefix ceph] ******************************* 2025-09-27 00:38:59.742758 | orchestrator | Saturday 27 September 2025 00:38:52 +0000 (0:00:00.240) 0:00:00.943 **** 2025-09-27 00:38:59.742768 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:38:59.742780 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:38:59.742791 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:38:59.742801 | orchestrator | 2025-09-27 00:38:59.742813 | orchestrator | TASK [Remove all ceph related logical devices] ********************************* 2025-09-27 00:38:59.742823 | orchestrator | Saturday 27 September 2025 00:38:52 +0000 (0:00:00.667) 0:00:01.610 **** 2025-09-27 00:38:59.742834 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:38:59.742845 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:38:59.742856 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:38:59.742866 | orchestrator | 2025-09-27 00:38:59.742877 | orchestrator | TASK [Check device availability] *********************************************** 2025-09-27 00:38:59.742887 | orchestrator | Saturday 27 September 2025 00:38:53 +0000 (0:00:00.239) 0:00:01.850 **** 2025-09-27 00:38:59.742898 | orchestrator | changed: [testbed-node-3] => (item=/dev/sdb) 2025-09-27 00:38:59.742913 | orchestrator | changed: [testbed-node-4] => (item=/dev/sdb) 2025-09-27 00:38:59.742924 | orchestrator | changed: [testbed-node-5] => (item=/dev/sdb) 2025-09-27 00:38:59.742935 | orchestrator | changed: [testbed-node-3] => (item=/dev/sdc) 2025-09-27 00:38:59.742945 | orchestrator | changed: [testbed-node-4] => (item=/dev/sdc) 2025-09-27 00:38:59.742956 | orchestrator | changed: [testbed-node-5] => (item=/dev/sdc) 2025-09-27 00:38:59.742967 | orchestrator | changed: [testbed-node-3] => (item=/dev/sdd) 
2025-09-27 00:38:59.742978 | orchestrator | changed: [testbed-node-5] => (item=/dev/sdd) 2025-09-27 00:38:59.742988 | orchestrator | changed: [testbed-node-4] => (item=/dev/sdd) 2025-09-27 00:38:59.742999 | orchestrator | 2025-09-27 00:38:59.743009 | orchestrator | TASK [Wipe partitions with wipefs] ********************************************* 2025-09-27 00:38:59.743021 | orchestrator | Saturday 27 September 2025 00:38:54 +0000 (0:00:01.172) 0:00:03.022 **** 2025-09-27 00:38:59.743032 | orchestrator | ok: [testbed-node-3] => (item=/dev/sdb) 2025-09-27 00:38:59.743043 | orchestrator | ok: [testbed-node-4] => (item=/dev/sdb) 2025-09-27 00:38:59.743054 | orchestrator | ok: [testbed-node-5] => (item=/dev/sdb) 2025-09-27 00:38:59.743064 | orchestrator | ok: [testbed-node-3] => (item=/dev/sdc) 2025-09-27 00:38:59.743075 | orchestrator | ok: [testbed-node-5] => (item=/dev/sdc) 2025-09-27 00:38:59.743085 | orchestrator | ok: [testbed-node-4] => (item=/dev/sdc) 2025-09-27 00:38:59.743096 | orchestrator | ok: [testbed-node-3] => (item=/dev/sdd) 2025-09-27 00:38:59.743106 | orchestrator | ok: [testbed-node-5] => (item=/dev/sdd) 2025-09-27 00:38:59.743117 | orchestrator | ok: [testbed-node-4] => (item=/dev/sdd) 2025-09-27 00:38:59.743128 | orchestrator | 2025-09-27 00:38:59.743139 | orchestrator | TASK [Overwrite first 32M with zeros] ****************************************** 2025-09-27 00:38:59.743149 | orchestrator | Saturday 27 September 2025 00:38:55 +0000 (0:00:01.435) 0:00:04.458 **** 2025-09-27 00:38:59.743160 | orchestrator | changed: [testbed-node-3] => (item=/dev/sdb) 2025-09-27 00:38:59.743171 | orchestrator | changed: [testbed-node-4] => (item=/dev/sdb) 2025-09-27 00:38:59.743181 | orchestrator | changed: [testbed-node-5] => (item=/dev/sdb) 2025-09-27 00:38:59.743192 | orchestrator | changed: [testbed-node-3] => (item=/dev/sdc) 2025-09-27 00:38:59.743226 | orchestrator | changed: [testbed-node-4] => (item=/dev/sdc) 2025-09-27 00:38:59.743238 | orchestrator | changed: [testbed-node-5] => (item=/dev/sdc) 2025-09-27 00:38:59.743248 | orchestrator | changed: [testbed-node-3] => (item=/dev/sdd) 2025-09-27 00:38:59.743269 | orchestrator | changed: [testbed-node-4] => (item=/dev/sdd) 2025-09-27 00:38:59.743286 | orchestrator | changed: [testbed-node-5] => (item=/dev/sdd) 2025-09-27 00:38:59.743297 | orchestrator | 2025-09-27 00:38:59.743308 | orchestrator | TASK [Reload udev rules] ******************************************************* 2025-09-27 00:38:59.743319 | orchestrator | Saturday 27 September 2025 00:38:58 +0000 (0:00:02.396) 0:00:06.855 **** 2025-09-27 00:38:59.743330 | orchestrator | changed: [testbed-node-3] 2025-09-27 00:38:59.743340 | orchestrator | changed: [testbed-node-4] 2025-09-27 00:38:59.743351 | orchestrator | changed: [testbed-node-5] 2025-09-27 00:38:59.743362 | orchestrator | 2025-09-27 00:38:59.743372 | orchestrator | TASK [Request device events from the kernel] *********************************** 2025-09-27 00:38:59.743383 | orchestrator | Saturday 27 September 2025 00:38:58 +0000 (0:00:00.625) 0:00:07.480 **** 2025-09-27 00:38:59.743394 | orchestrator | changed: [testbed-node-3] 2025-09-27 00:38:59.743404 | orchestrator | changed: [testbed-node-4] 2025-09-27 00:38:59.743415 | orchestrator | changed: [testbed-node-5] 2025-09-27 00:38:59.743426 | orchestrator | 2025-09-27 00:38:59.743436 | orchestrator | PLAY RECAP ********************************************************************* 2025-09-27 00:38:59.743450 | orchestrator | testbed-node-3 : ok=7  changed=5  
unreachable=0 failed=0 skipped=2  rescued=0 ignored=0 2025-09-27 00:38:59.743461 | orchestrator | testbed-node-4 : ok=7  changed=5  unreachable=0 failed=0 skipped=2  rescued=0 ignored=0 2025-09-27 00:38:59.743490 | orchestrator | testbed-node-5 : ok=7  changed=5  unreachable=0 failed=0 skipped=2  rescued=0 ignored=0 2025-09-27 00:38:59.743502 | orchestrator | 2025-09-27 00:38:59.743512 | orchestrator | 2025-09-27 00:38:59.743523 | orchestrator | TASKS RECAP ******************************************************************** 2025-09-27 00:38:59.743534 | orchestrator | Saturday 27 September 2025 00:38:59 +0000 (0:00:00.633) 0:00:08.113 **** 2025-09-27 00:38:59.743544 | orchestrator | =============================================================================== 2025-09-27 00:38:59.743555 | orchestrator | Overwrite first 32M with zeros ------------------------------------------ 2.40s 2025-09-27 00:38:59.743566 | orchestrator | Wipe partitions with wipefs --------------------------------------------- 1.44s 2025-09-27 00:38:59.743577 | orchestrator | Check device availability ----------------------------------------------- 1.17s 2025-09-27 00:38:59.743588 | orchestrator | Find all logical devices with prefix ceph ------------------------------- 0.67s 2025-09-27 00:38:59.743598 | orchestrator | Request device events from the kernel ----------------------------------- 0.63s 2025-09-27 00:38:59.743609 | orchestrator | Reload udev rules ------------------------------------------------------- 0.63s 2025-09-27 00:38:59.743620 | orchestrator | Find all logical devices owned by UID 167 ------------------------------- 0.57s 2025-09-27 00:38:59.743630 | orchestrator | Remove all rook related logical devices --------------------------------- 0.24s 2025-09-27 00:38:59.743641 | orchestrator | Remove all ceph related logical devices --------------------------------- 0.24s 2025-09-27 00:39:11.981644 | orchestrator | 2025-09-27 00:39:11 | INFO  | Task 73c642ce-9349-48b6-bb03-72b8d315ef96 (facts) was prepared for execution. 2025-09-27 00:39:11.981763 | orchestrator | 2025-09-27 00:39:11 | INFO  | It takes a moment until task 73c642ce-9349-48b6-bb03-72b8d315ef96 (facts) has been started and output is visible here. 
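The wipe-partitions play above prepares the spare disks (/dev/sdb, /dev/sdc and /dev/sdd on testbed-node-3 to -5) for Ceph by removing existing signatures, zeroing the first 32M and re-triggering udev. The equivalent manual steps for a single device, sketched from the play's task names (the actual module arguments are not visible in the log):

    DEVICE=/dev/sdb   # example device from the play output
    # Remove all filesystem, RAID and partition-table signatures.
    sudo wipefs -a "$DEVICE"
    # Overwrite the first 32M with zeros so no stale metadata survives.
    sudo dd if=/dev/zero of="$DEVICE" bs=1M count=32 conv=fsync
    # Reload udev rules and request device events from the kernel.
    sudo udevadm control --reload-rules
    sudo udevadm trigger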
2025-09-27 00:39:24.093180 | orchestrator | 2025-09-27 00:39:24.093352 | orchestrator | PLAY [Apply role facts] ******************************************************** 2025-09-27 00:39:24.093371 | orchestrator | 2025-09-27 00:39:24.093383 | orchestrator | TASK [osism.commons.facts : Create custom facts directory] ********************* 2025-09-27 00:39:24.093395 | orchestrator | Saturday 27 September 2025 00:39:15 +0000 (0:00:00.265) 0:00:00.265 **** 2025-09-27 00:39:24.093407 | orchestrator | ok: [testbed-manager] 2025-09-27 00:39:24.093419 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:39:24.093430 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:39:24.093463 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:39:24.093474 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:39:24.093485 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:39:24.093496 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:39:24.093506 | orchestrator | 2025-09-27 00:39:24.093517 | orchestrator | TASK [osism.commons.facts : Copy fact files] *********************************** 2025-09-27 00:39:24.093528 | orchestrator | Saturday 27 September 2025 00:39:17 +0000 (0:00:01.072) 0:00:01.337 **** 2025-09-27 00:39:24.093539 | orchestrator | skipping: [testbed-manager] 2025-09-27 00:39:24.093550 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:39:24.093561 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:39:24.093571 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:39:24.093582 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:39:24.093593 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:39:24.093603 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:39:24.093614 | orchestrator | 2025-09-27 00:39:24.093625 | orchestrator | PLAY [Gather facts for all hosts] ********************************************** 2025-09-27 00:39:24.093636 | orchestrator | 2025-09-27 00:39:24.093664 | orchestrator | TASK [Gathers facts about hosts] *********************************************** 2025-09-27 00:39:24.093675 | orchestrator | Saturday 27 September 2025 00:39:18 +0000 (0:00:01.217) 0:00:02.554 **** 2025-09-27 00:39:24.093686 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:39:24.093696 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:39:24.093708 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:39:24.093721 | orchestrator | ok: [testbed-manager] 2025-09-27 00:39:24.093734 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:39:24.093746 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:39:24.093759 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:39:24.093771 | orchestrator | 2025-09-27 00:39:24.093784 | orchestrator | PLAY [Gather facts for all hosts if using --limit] ***************************** 2025-09-27 00:39:24.093797 | orchestrator | 2025-09-27 00:39:24.093810 | orchestrator | TASK [Gather facts for all hosts] ********************************************** 2025-09-27 00:39:24.093822 | orchestrator | Saturday 27 September 2025 00:39:23 +0000 (0:00:05.003) 0:00:07.558 **** 2025-09-27 00:39:24.093835 | orchestrator | skipping: [testbed-manager] 2025-09-27 00:39:24.093848 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:39:24.093861 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:39:24.093873 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:39:24.093886 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:39:24.093898 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:39:24.093911 | orchestrator | skipping: 
[testbed-node-5] 2025-09-27 00:39:24.093924 | orchestrator | 2025-09-27 00:39:24.093937 | orchestrator | PLAY RECAP ********************************************************************* 2025-09-27 00:39:24.093950 | orchestrator | testbed-manager : ok=2  changed=0 unreachable=0 failed=0 skipped=2  rescued=0 ignored=0 2025-09-27 00:39:24.093964 | orchestrator | testbed-node-0 : ok=2  changed=0 unreachable=0 failed=0 skipped=2  rescued=0 ignored=0 2025-09-27 00:39:24.093977 | orchestrator | testbed-node-1 : ok=2  changed=0 unreachable=0 failed=0 skipped=2  rescued=0 ignored=0 2025-09-27 00:39:24.093990 | orchestrator | testbed-node-2 : ok=2  changed=0 unreachable=0 failed=0 skipped=2  rescued=0 ignored=0 2025-09-27 00:39:24.094003 | orchestrator | testbed-node-3 : ok=2  changed=0 unreachable=0 failed=0 skipped=2  rescued=0 ignored=0 2025-09-27 00:39:24.094076 | orchestrator | testbed-node-4 : ok=2  changed=0 unreachable=0 failed=0 skipped=2  rescued=0 ignored=0 2025-09-27 00:39:24.094090 | orchestrator | testbed-node-5 : ok=2  changed=0 unreachable=0 failed=0 skipped=2  rescued=0 ignored=0 2025-09-27 00:39:24.094103 | orchestrator | 2025-09-27 00:39:24.094124 | orchestrator | 2025-09-27 00:39:24.094135 | orchestrator | TASKS RECAP ******************************************************************** 2025-09-27 00:39:24.094146 | orchestrator | Saturday 27 September 2025 00:39:23 +0000 (0:00:00.501) 0:00:08.059 **** 2025-09-27 00:39:24.094156 | orchestrator | =============================================================================== 2025-09-27 00:39:24.094167 | orchestrator | Gathers facts about hosts ----------------------------------------------- 5.00s 2025-09-27 00:39:24.094178 | orchestrator | osism.commons.facts : Copy fact files ----------------------------------- 1.22s 2025-09-27 00:39:24.094189 | orchestrator | osism.commons.facts : Create custom facts directory --------------------- 1.07s 2025-09-27 00:39:24.094200 | orchestrator | Gather facts for all hosts ---------------------------------------------- 0.50s 2025-09-27 00:39:26.286517 | orchestrator | 2025-09-27 00:39:26 | INFO  | Task 48e4d13d-b97b-4e08-bf2b-d4b4cb15369f (ceph-configure-lvm-volumes) was prepared for execution. 2025-09-27 00:39:26.286611 | orchestrator | 2025-09-27 00:39:26 | INFO  | It takes a moment until task 48e4d13d-b97b-4e08-bf2b-d4b4cb15369f (ceph-configure-lvm-volumes) has been started and output is visible here. 
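The ceph-configure-lvm-volumes play that follows derives one LVM UUID per OSD device on each storage node and turns that into the lvm_volumes structure consumed by ceph-ansible, where each entry points at a data LV named osd-block-<uuid> inside a VG named ceph-<uuid> (see the printed configuration data further down). An illustrative sketch of that mapping, using the device/UUID pairs reported for testbed-node-3 (the real play computes the UUIDs itself):

    # Map OSD devices to their LVM UUIDs and emit ceph-ansible style lvm_volumes entries.
    declare -A ceph_osd_devices=(
        [sdb]="025d8a54-72cd-5dfc-843f-2890244ba468"
        [sdc]="9ca7935d-e986-5962-b530-505e6c7ac609"
    )
    echo "lvm_volumes:"
    for dev in "${!ceph_osd_devices[@]}"; do
        uuid=${ceph_osd_devices[$dev]}
        echo "  - data: osd-block-$uuid"
        echo "    data_vg: ceph-$uuid"
    done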
2025-09-27 00:39:37.851865 | orchestrator | 2025-09-27 00:39:37.851976 | orchestrator | PLAY [Ceph configure LVM] ****************************************************** 2025-09-27 00:39:37.851994 | orchestrator | 2025-09-27 00:39:37.852006 | orchestrator | TASK [Get extra vars for Ceph configuration] *********************************** 2025-09-27 00:39:37.852018 | orchestrator | Saturday 27 September 2025 00:39:30 +0000 (0:00:00.321) 0:00:00.321 **** 2025-09-27 00:39:37.852029 | orchestrator | ok: [testbed-node-3 -> testbed-manager(192.168.16.5)] 2025-09-27 00:39:37.852040 | orchestrator | 2025-09-27 00:39:37.852051 | orchestrator | TASK [Get initial list of available block devices] ***************************** 2025-09-27 00:39:37.852062 | orchestrator | Saturday 27 September 2025 00:39:30 +0000 (0:00:00.240) 0:00:00.562 **** 2025-09-27 00:39:37.852072 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:39:37.852084 | orchestrator | 2025-09-27 00:39:37.852095 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-09-27 00:39:37.852105 | orchestrator | Saturday 27 September 2025 00:39:30 +0000 (0:00:00.217) 0:00:00.780 **** 2025-09-27 00:39:37.852116 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-3 => (item=loop0) 2025-09-27 00:39:37.852127 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-3 => (item=loop1) 2025-09-27 00:39:37.852138 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-3 => (item=loop2) 2025-09-27 00:39:37.852161 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-3 => (item=loop3) 2025-09-27 00:39:37.852172 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-3 => (item=loop4) 2025-09-27 00:39:37.852183 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-3 => (item=loop5) 2025-09-27 00:39:37.852193 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-3 => (item=loop6) 2025-09-27 00:39:37.852265 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-3 => (item=loop7) 2025-09-27 00:39:37.852279 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-3 => (item=sda) 2025-09-27 00:39:37.852290 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-3 => (item=sdb) 2025-09-27 00:39:37.852300 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-3 => (item=sdc) 2025-09-27 00:39:37.852311 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-3 => (item=sdd) 2025-09-27 00:39:37.852321 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-3 => (item=sr0) 2025-09-27 00:39:37.852332 | orchestrator | 2025-09-27 00:39:37.852343 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-09-27 00:39:37.852353 | orchestrator | Saturday 27 September 2025 00:39:31 +0000 (0:00:00.342) 0:00:01.122 **** 2025-09-27 00:39:37.852365 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:39:37.852394 | orchestrator | 2025-09-27 00:39:37.852407 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-09-27 00:39:37.852419 | orchestrator | Saturday 27 September 2025 00:39:31 +0000 (0:00:00.466) 0:00:01.589 **** 2025-09-27 00:39:37.852431 | orchestrator | skipping: [testbed-node-3] 2025-09-27 
00:39:37.852444 | orchestrator | 2025-09-27 00:39:37.852456 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-09-27 00:39:37.852468 | orchestrator | Saturday 27 September 2025 00:39:31 +0000 (0:00:00.195) 0:00:01.785 **** 2025-09-27 00:39:37.852480 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:39:37.852492 | orchestrator | 2025-09-27 00:39:37.852504 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-09-27 00:39:37.852517 | orchestrator | Saturday 27 September 2025 00:39:32 +0000 (0:00:00.208) 0:00:01.994 **** 2025-09-27 00:39:37.852529 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:39:37.852545 | orchestrator | 2025-09-27 00:39:37.852557 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-09-27 00:39:37.852569 | orchestrator | Saturday 27 September 2025 00:39:32 +0000 (0:00:00.208) 0:00:02.203 **** 2025-09-27 00:39:37.852582 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:39:37.852593 | orchestrator | 2025-09-27 00:39:37.852606 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-09-27 00:39:37.852618 | orchestrator | Saturday 27 September 2025 00:39:32 +0000 (0:00:00.200) 0:00:02.403 **** 2025-09-27 00:39:37.852631 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:39:37.852643 | orchestrator | 2025-09-27 00:39:37.852655 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-09-27 00:39:37.852668 | orchestrator | Saturday 27 September 2025 00:39:32 +0000 (0:00:00.203) 0:00:02.607 **** 2025-09-27 00:39:37.852680 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:39:37.852693 | orchestrator | 2025-09-27 00:39:37.852705 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-09-27 00:39:37.852717 | orchestrator | Saturday 27 September 2025 00:39:32 +0000 (0:00:00.184) 0:00:02.791 **** 2025-09-27 00:39:37.852729 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:39:37.852741 | orchestrator | 2025-09-27 00:39:37.852753 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-09-27 00:39:37.852763 | orchestrator | Saturday 27 September 2025 00:39:33 +0000 (0:00:00.213) 0:00:03.004 **** 2025-09-27 00:39:37.852774 | orchestrator | ok: [testbed-node-3] => (item=scsi-0QEMU_QEMU_HARDDISK_9830fc06-72ca-4b97-ae11-006364930d3a) 2025-09-27 00:39:37.852785 | orchestrator | ok: [testbed-node-3] => (item=scsi-SQEMU_QEMU_HARDDISK_9830fc06-72ca-4b97-ae11-006364930d3a) 2025-09-27 00:39:37.852796 | orchestrator | 2025-09-27 00:39:37.852807 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-09-27 00:39:37.852817 | orchestrator | Saturday 27 September 2025 00:39:33 +0000 (0:00:00.389) 0:00:03.394 **** 2025-09-27 00:39:37.852846 | orchestrator | ok: [testbed-node-3] => (item=scsi-0QEMU_QEMU_HARDDISK_db689dff-d74e-43e3-a305-79ec0de29e1e) 2025-09-27 00:39:37.852857 | orchestrator | ok: [testbed-node-3] => (item=scsi-SQEMU_QEMU_HARDDISK_db689dff-d74e-43e3-a305-79ec0de29e1e) 2025-09-27 00:39:37.852868 | orchestrator | 2025-09-27 00:39:37.852879 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-09-27 00:39:37.852889 | orchestrator | Saturday 27 September 2025 00:39:33 +0000 (0:00:00.397) 0:00:03.791 **** 2025-09-27 
00:39:37.852906 | orchestrator | ok: [testbed-node-3] => (item=scsi-0QEMU_QEMU_HARDDISK_aa54db64-5ca4-4f56-bafa-5b00a4002696) 2025-09-27 00:39:37.852918 | orchestrator | ok: [testbed-node-3] => (item=scsi-SQEMU_QEMU_HARDDISK_aa54db64-5ca4-4f56-bafa-5b00a4002696) 2025-09-27 00:39:37.852928 | orchestrator | 2025-09-27 00:39:37.852939 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-09-27 00:39:37.852949 | orchestrator | Saturday 27 September 2025 00:39:34 +0000 (0:00:00.602) 0:00:04.394 **** 2025-09-27 00:39:37.852960 | orchestrator | ok: [testbed-node-3] => (item=scsi-0QEMU_QEMU_HARDDISK_09efdf41-dbe9-4aba-b0d6-c49a377077cc) 2025-09-27 00:39:37.852979 | orchestrator | ok: [testbed-node-3] => (item=scsi-SQEMU_QEMU_HARDDISK_09efdf41-dbe9-4aba-b0d6-c49a377077cc) 2025-09-27 00:39:37.852989 | orchestrator | 2025-09-27 00:39:37.853000 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-09-27 00:39:37.853011 | orchestrator | Saturday 27 September 2025 00:39:35 +0000 (0:00:00.605) 0:00:05.000 **** 2025-09-27 00:39:37.853021 | orchestrator | ok: [testbed-node-3] => (item=ata-QEMU_DVD-ROM_QM00001) 2025-09-27 00:39:37.853032 | orchestrator | 2025-09-27 00:39:37.853042 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-09-27 00:39:37.853053 | orchestrator | Saturday 27 September 2025 00:39:35 +0000 (0:00:00.765) 0:00:05.765 **** 2025-09-27 00:39:37.853063 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-3 => (item=loop0) 2025-09-27 00:39:37.853074 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-3 => (item=loop1) 2025-09-27 00:39:37.853084 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-3 => (item=loop2) 2025-09-27 00:39:37.853095 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-3 => (item=loop3) 2025-09-27 00:39:37.853105 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-3 => (item=loop4) 2025-09-27 00:39:37.853116 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-3 => (item=loop5) 2025-09-27 00:39:37.853126 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-3 => (item=loop6) 2025-09-27 00:39:37.853137 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-3 => (item=loop7) 2025-09-27 00:39:37.853147 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-3 => (item=sda) 2025-09-27 00:39:37.853158 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-3 => (item=sdb) 2025-09-27 00:39:37.853168 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-3 => (item=sdc) 2025-09-27 00:39:37.853179 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-3 => (item=sdd) 2025-09-27 00:39:37.853190 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-3 => (item=sr0) 2025-09-27 00:39:37.853201 | orchestrator | 2025-09-27 00:39:37.853238 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-09-27 00:39:37.853249 | orchestrator | Saturday 27 September 2025 00:39:36 +0000 (0:00:00.388) 0:00:06.154 **** 2025-09-27 00:39:37.853259 | orchestrator | skipping: [testbed-node-3] 
2025-09-27 00:39:37.853270 | orchestrator | 2025-09-27 00:39:37.853281 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-09-27 00:39:37.853291 | orchestrator | Saturday 27 September 2025 00:39:36 +0000 (0:00:00.196) 0:00:06.350 **** 2025-09-27 00:39:37.853302 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:39:37.853312 | orchestrator | 2025-09-27 00:39:37.853323 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-09-27 00:39:37.853334 | orchestrator | Saturday 27 September 2025 00:39:36 +0000 (0:00:00.195) 0:00:06.546 **** 2025-09-27 00:39:37.853344 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:39:37.853355 | orchestrator | 2025-09-27 00:39:37.853365 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-09-27 00:39:37.853376 | orchestrator | Saturday 27 September 2025 00:39:36 +0000 (0:00:00.198) 0:00:06.744 **** 2025-09-27 00:39:37.853386 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:39:37.853397 | orchestrator | 2025-09-27 00:39:37.853408 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-09-27 00:39:37.853432 | orchestrator | Saturday 27 September 2025 00:39:37 +0000 (0:00:00.195) 0:00:06.940 **** 2025-09-27 00:39:37.853443 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:39:37.853463 | orchestrator | 2025-09-27 00:39:37.853474 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-09-27 00:39:37.853492 | orchestrator | Saturday 27 September 2025 00:39:37 +0000 (0:00:00.201) 0:00:07.141 **** 2025-09-27 00:39:37.853503 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:39:37.853514 | orchestrator | 2025-09-27 00:39:37.853524 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-09-27 00:39:37.853535 | orchestrator | Saturday 27 September 2025 00:39:37 +0000 (0:00:00.216) 0:00:07.358 **** 2025-09-27 00:39:37.853546 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:39:37.853556 | orchestrator | 2025-09-27 00:39:37.853567 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-09-27 00:39:37.853578 | orchestrator | Saturday 27 September 2025 00:39:37 +0000 (0:00:00.217) 0:00:07.575 **** 2025-09-27 00:39:37.853596 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:39:45.103484 | orchestrator | 2025-09-27 00:39:45.103594 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-09-27 00:39:45.103611 | orchestrator | Saturday 27 September 2025 00:39:37 +0000 (0:00:00.201) 0:00:07.776 **** 2025-09-27 00:39:45.103623 | orchestrator | ok: [testbed-node-3] => (item=sda1) 2025-09-27 00:39:45.103635 | orchestrator | ok: [testbed-node-3] => (item=sda14) 2025-09-27 00:39:45.103647 | orchestrator | ok: [testbed-node-3] => (item=sda15) 2025-09-27 00:39:45.103658 | orchestrator | ok: [testbed-node-3] => (item=sda16) 2025-09-27 00:39:45.103669 | orchestrator | 2025-09-27 00:39:45.103680 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-09-27 00:39:45.103691 | orchestrator | Saturday 27 September 2025 00:39:38 +0000 (0:00:00.989) 0:00:08.765 **** 2025-09-27 00:39:45.103719 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:39:45.103730 | orchestrator | 2025-09-27 00:39:45.103741 | orchestrator | 
TASK [Add known partitions to the list of available block devices] ************* 2025-09-27 00:39:45.103752 | orchestrator | Saturday 27 September 2025 00:39:39 +0000 (0:00:00.187) 0:00:08.953 **** 2025-09-27 00:39:45.103763 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:39:45.103774 | orchestrator | 2025-09-27 00:39:45.103784 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-09-27 00:39:45.103795 | orchestrator | Saturday 27 September 2025 00:39:39 +0000 (0:00:00.189) 0:00:09.142 **** 2025-09-27 00:39:45.103806 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:39:45.103817 | orchestrator | 2025-09-27 00:39:45.103828 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-09-27 00:39:45.103839 | orchestrator | Saturday 27 September 2025 00:39:39 +0000 (0:00:00.177) 0:00:09.320 **** 2025-09-27 00:39:45.103850 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:39:45.103860 | orchestrator | 2025-09-27 00:39:45.103871 | orchestrator | TASK [Set UUIDs for OSD VGs/LVs] *********************************************** 2025-09-27 00:39:45.103882 | orchestrator | Saturday 27 September 2025 00:39:39 +0000 (0:00:00.195) 0:00:09.515 **** 2025-09-27 00:39:45.103893 | orchestrator | ok: [testbed-node-3] => (item={'key': 'sdb', 'value': None}) 2025-09-27 00:39:45.103904 | orchestrator | ok: [testbed-node-3] => (item={'key': 'sdc', 'value': None}) 2025-09-27 00:39:45.103915 | orchestrator | 2025-09-27 00:39:45.103926 | orchestrator | TASK [Generate WAL VG names] *************************************************** 2025-09-27 00:39:45.103937 | orchestrator | Saturday 27 September 2025 00:39:39 +0000 (0:00:00.169) 0:00:09.685 **** 2025-09-27 00:39:45.103948 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:39:45.103958 | orchestrator | 2025-09-27 00:39:45.103969 | orchestrator | TASK [Generate DB VG names] **************************************************** 2025-09-27 00:39:45.103980 | orchestrator | Saturday 27 September 2025 00:39:39 +0000 (0:00:00.146) 0:00:09.831 **** 2025-09-27 00:39:45.103991 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:39:45.104001 | orchestrator | 2025-09-27 00:39:45.104012 | orchestrator | TASK [Generate shared DB/WAL VG names] ***************************************** 2025-09-27 00:39:45.104023 | orchestrator | Saturday 27 September 2025 00:39:40 +0000 (0:00:00.138) 0:00:09.969 **** 2025-09-27 00:39:45.104035 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:39:45.104069 | orchestrator | 2025-09-27 00:39:45.104082 | orchestrator | TASK [Define lvm_volumes structures] ******************************************* 2025-09-27 00:39:45.104094 | orchestrator | Saturday 27 September 2025 00:39:40 +0000 (0:00:00.153) 0:00:10.123 **** 2025-09-27 00:39:45.104107 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:39:45.104119 | orchestrator | 2025-09-27 00:39:45.104131 | orchestrator | TASK [Generate lvm_volumes structure (block only)] ***************************** 2025-09-27 00:39:45.104143 | orchestrator | Saturday 27 September 2025 00:39:40 +0000 (0:00:00.132) 0:00:10.255 **** 2025-09-27 00:39:45.104156 | orchestrator | ok: [testbed-node-3] => (item={'key': 'sdb', 'value': {'osd_lvm_uuid': '025d8a54-72cd-5dfc-843f-2890244ba468'}}) 2025-09-27 00:39:45.104169 | orchestrator | ok: [testbed-node-3] => (item={'key': 'sdc', 'value': {'osd_lvm_uuid': '9ca7935d-e986-5962-b530-505e6c7ac609'}}) 2025-09-27 00:39:45.104181 | orchestrator | 
2025-09-27 00:39:45.104193 | orchestrator | TASK [Generate lvm_volumes structure (block + db)] ***************************** 2025-09-27 00:39:45.104232 | orchestrator | Saturday 27 September 2025 00:39:40 +0000 (0:00:00.161) 0:00:10.416 **** 2025-09-27 00:39:45.104245 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'sdb', 'value': {'osd_lvm_uuid': '025d8a54-72cd-5dfc-843f-2890244ba468'}})  2025-09-27 00:39:45.104266 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'sdc', 'value': {'osd_lvm_uuid': '9ca7935d-e986-5962-b530-505e6c7ac609'}})  2025-09-27 00:39:45.104279 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:39:45.104292 | orchestrator | 2025-09-27 00:39:45.104304 | orchestrator | TASK [Generate lvm_volumes structure (block + wal)] **************************** 2025-09-27 00:39:45.104316 | orchestrator | Saturday 27 September 2025 00:39:40 +0000 (0:00:00.148) 0:00:10.565 **** 2025-09-27 00:39:45.104328 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'sdb', 'value': {'osd_lvm_uuid': '025d8a54-72cd-5dfc-843f-2890244ba468'}})  2025-09-27 00:39:45.104340 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'sdc', 'value': {'osd_lvm_uuid': '9ca7935d-e986-5962-b530-505e6c7ac609'}})  2025-09-27 00:39:45.104352 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:39:45.104365 | orchestrator | 2025-09-27 00:39:45.104378 | orchestrator | TASK [Generate lvm_volumes structure (block + db + wal)] *********************** 2025-09-27 00:39:45.104389 | orchestrator | Saturday 27 September 2025 00:39:41 +0000 (0:00:00.376) 0:00:10.941 **** 2025-09-27 00:39:45.104400 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'sdb', 'value': {'osd_lvm_uuid': '025d8a54-72cd-5dfc-843f-2890244ba468'}})  2025-09-27 00:39:45.104411 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'sdc', 'value': {'osd_lvm_uuid': '9ca7935d-e986-5962-b530-505e6c7ac609'}})  2025-09-27 00:39:45.104421 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:39:45.104432 | orchestrator | 2025-09-27 00:39:45.104460 | orchestrator | TASK [Compile lvm_volumes] ***************************************************** 2025-09-27 00:39:45.104472 | orchestrator | Saturday 27 September 2025 00:39:41 +0000 (0:00:00.175) 0:00:11.117 **** 2025-09-27 00:39:45.104482 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:39:45.104493 | orchestrator | 2025-09-27 00:39:45.104504 | orchestrator | TASK [Set OSD devices config data] ********************************************* 2025-09-27 00:39:45.104514 | orchestrator | Saturday 27 September 2025 00:39:41 +0000 (0:00:00.156) 0:00:11.273 **** 2025-09-27 00:39:45.104525 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:39:45.104535 | orchestrator | 2025-09-27 00:39:45.104546 | orchestrator | TASK [Set DB devices config data] ********************************************** 2025-09-27 00:39:45.104557 | orchestrator | Saturday 27 September 2025 00:39:41 +0000 (0:00:00.150) 0:00:11.424 **** 2025-09-27 00:39:45.104567 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:39:45.104577 | orchestrator | 2025-09-27 00:39:45.104588 | orchestrator | TASK [Set WAL devices config data] ********************************************* 2025-09-27 00:39:45.104599 | orchestrator | Saturday 27 September 2025 00:39:41 +0000 (0:00:00.134) 0:00:11.559 **** 2025-09-27 00:39:45.104609 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:39:45.104620 | orchestrator | 2025-09-27 00:39:45.104639 | orchestrator | TASK [Set DB+WAL devices config data] 
****************************************** 2025-09-27 00:39:45.104650 | orchestrator | Saturday 27 September 2025 00:39:41 +0000 (0:00:00.138) 0:00:11.697 **** 2025-09-27 00:39:45.104661 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:39:45.104671 | orchestrator | 2025-09-27 00:39:45.104682 | orchestrator | TASK [Print ceph_osd_devices] ************************************************** 2025-09-27 00:39:45.104692 | orchestrator | Saturday 27 September 2025 00:39:41 +0000 (0:00:00.133) 0:00:11.831 **** 2025-09-27 00:39:45.104703 | orchestrator | ok: [testbed-node-3] => { 2025-09-27 00:39:45.104713 | orchestrator |  "ceph_osd_devices": { 2025-09-27 00:39:45.104725 | orchestrator |  "sdb": { 2025-09-27 00:39:45.104736 | orchestrator |  "osd_lvm_uuid": "025d8a54-72cd-5dfc-843f-2890244ba468" 2025-09-27 00:39:45.104748 | orchestrator |  }, 2025-09-27 00:39:45.104758 | orchestrator |  "sdc": { 2025-09-27 00:39:45.104769 | orchestrator |  "osd_lvm_uuid": "9ca7935d-e986-5962-b530-505e6c7ac609" 2025-09-27 00:39:45.104780 | orchestrator |  } 2025-09-27 00:39:45.104790 | orchestrator |  } 2025-09-27 00:39:45.104801 | orchestrator | } 2025-09-27 00:39:45.104812 | orchestrator | 2025-09-27 00:39:45.104823 | orchestrator | TASK [Print WAL devices] ******************************************************* 2025-09-27 00:39:45.104834 | orchestrator | Saturday 27 September 2025 00:39:42 +0000 (0:00:00.136) 0:00:11.968 **** 2025-09-27 00:39:45.104844 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:39:45.104855 | orchestrator | 2025-09-27 00:39:45.104865 | orchestrator | TASK [Print DB devices] ******************************************************** 2025-09-27 00:39:45.104876 | orchestrator | Saturday 27 September 2025 00:39:42 +0000 (0:00:00.138) 0:00:12.106 **** 2025-09-27 00:39:45.104892 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:39:45.104903 | orchestrator | 2025-09-27 00:39:45.104914 | orchestrator | TASK [Print shared DB/WAL devices] ********************************************* 2025-09-27 00:39:45.104924 | orchestrator | Saturday 27 September 2025 00:39:42 +0000 (0:00:00.138) 0:00:12.245 **** 2025-09-27 00:39:45.104935 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:39:45.104945 | orchestrator | 2025-09-27 00:39:45.104956 | orchestrator | TASK [Print configuration data] ************************************************ 2025-09-27 00:39:45.104966 | orchestrator | Saturday 27 September 2025 00:39:42 +0000 (0:00:00.141) 0:00:12.386 **** 2025-09-27 00:39:45.104977 | orchestrator | changed: [testbed-node-3] => { 2025-09-27 00:39:45.104987 | orchestrator |  "_ceph_configure_lvm_config_data": { 2025-09-27 00:39:45.104998 | orchestrator |  "ceph_osd_devices": { 2025-09-27 00:39:45.105009 | orchestrator |  "sdb": { 2025-09-27 00:39:45.105019 | orchestrator |  "osd_lvm_uuid": "025d8a54-72cd-5dfc-843f-2890244ba468" 2025-09-27 00:39:45.105030 | orchestrator |  }, 2025-09-27 00:39:45.105041 | orchestrator |  "sdc": { 2025-09-27 00:39:45.105051 | orchestrator |  "osd_lvm_uuid": "9ca7935d-e986-5962-b530-505e6c7ac609" 2025-09-27 00:39:45.105062 | orchestrator |  } 2025-09-27 00:39:45.105073 | orchestrator |  }, 2025-09-27 00:39:45.105083 | orchestrator |  "lvm_volumes": [ 2025-09-27 00:39:45.105094 | orchestrator |  { 2025-09-27 00:39:45.105104 | orchestrator |  "data": "osd-block-025d8a54-72cd-5dfc-843f-2890244ba468", 2025-09-27 00:39:45.105115 | orchestrator |  "data_vg": "ceph-025d8a54-72cd-5dfc-843f-2890244ba468" 2025-09-27 00:39:45.105125 | orchestrator |  }, 2025-09-27 
00:39:45.105136 | orchestrator |  { 2025-09-27 00:39:45.105146 | orchestrator |  "data": "osd-block-9ca7935d-e986-5962-b530-505e6c7ac609", 2025-09-27 00:39:45.105157 | orchestrator |  "data_vg": "ceph-9ca7935d-e986-5962-b530-505e6c7ac609" 2025-09-27 00:39:45.105167 | orchestrator |  } 2025-09-27 00:39:45.105178 | orchestrator |  ] 2025-09-27 00:39:45.105189 | orchestrator |  } 2025-09-27 00:39:45.105199 | orchestrator | } 2025-09-27 00:39:45.105226 | orchestrator | 2025-09-27 00:39:45.105236 | orchestrator | RUNNING HANDLER [Write configuration file] ************************************* 2025-09-27 00:39:45.105254 | orchestrator | Saturday 27 September 2025 00:39:42 +0000 (0:00:00.377) 0:00:12.763 **** 2025-09-27 00:39:45.105264 | orchestrator | changed: [testbed-node-3 -> testbed-manager(192.168.16.5)] 2025-09-27 00:39:45.105275 | orchestrator | 2025-09-27 00:39:45.105286 | orchestrator | PLAY [Ceph configure LVM] ****************************************************** 2025-09-27 00:39:45.105297 | orchestrator | 2025-09-27 00:39:45.105307 | orchestrator | TASK [Get extra vars for Ceph configuration] *********************************** 2025-09-27 00:39:45.105318 | orchestrator | Saturday 27 September 2025 00:39:44 +0000 (0:00:01.771) 0:00:14.535 **** 2025-09-27 00:39:45.105328 | orchestrator | ok: [testbed-node-4 -> testbed-manager(192.168.16.5)] 2025-09-27 00:39:45.105339 | orchestrator | 2025-09-27 00:39:45.105349 | orchestrator | TASK [Get initial list of available block devices] ***************************** 2025-09-27 00:39:45.105360 | orchestrator | Saturday 27 September 2025 00:39:44 +0000 (0:00:00.248) 0:00:14.783 **** 2025-09-27 00:39:45.105371 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:39:45.105381 | orchestrator | 2025-09-27 00:39:45.105392 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-09-27 00:39:45.105409 | orchestrator | Saturday 27 September 2025 00:39:45 +0000 (0:00:00.247) 0:00:15.031 **** 2025-09-27 00:39:52.647681 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-4 => (item=loop0) 2025-09-27 00:39:52.647791 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-4 => (item=loop1) 2025-09-27 00:39:52.647807 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-4 => (item=loop2) 2025-09-27 00:39:52.647819 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-4 => (item=loop3) 2025-09-27 00:39:52.647830 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-4 => (item=loop4) 2025-09-27 00:39:52.647842 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-4 => (item=loop5) 2025-09-27 00:39:52.647852 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-4 => (item=loop6) 2025-09-27 00:39:52.647863 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-4 => (item=loop7) 2025-09-27 00:39:52.647874 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-4 => (item=sda) 2025-09-27 00:39:52.647885 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-4 => (item=sdb) 2025-09-27 00:39:52.647915 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-4 => (item=sdc) 2025-09-27 00:39:52.647927 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-4 => (item=sdd) 2025-09-27 00:39:52.647938 | orchestrator | 
included: /ansible/tasks/_add-device-links.yml for testbed-node-4 => (item=sr0) 2025-09-27 00:39:52.647953 | orchestrator | 2025-09-27 00:39:52.647965 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-09-27 00:39:52.647977 | orchestrator | Saturday 27 September 2025 00:39:45 +0000 (0:00:00.378) 0:00:15.410 **** 2025-09-27 00:39:52.647988 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:39:52.648000 | orchestrator | 2025-09-27 00:39:52.648011 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-09-27 00:39:52.648021 | orchestrator | Saturday 27 September 2025 00:39:45 +0000 (0:00:00.208) 0:00:15.618 **** 2025-09-27 00:39:52.648032 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:39:52.648043 | orchestrator | 2025-09-27 00:39:52.648053 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-09-27 00:39:52.648064 | orchestrator | Saturday 27 September 2025 00:39:45 +0000 (0:00:00.184) 0:00:15.802 **** 2025-09-27 00:39:52.648074 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:39:52.648085 | orchestrator | 2025-09-27 00:39:52.648096 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-09-27 00:39:52.648106 | orchestrator | Saturday 27 September 2025 00:39:46 +0000 (0:00:00.205) 0:00:16.008 **** 2025-09-27 00:39:52.648117 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:39:52.648149 | orchestrator | 2025-09-27 00:39:52.648161 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-09-27 00:39:52.648171 | orchestrator | Saturday 27 September 2025 00:39:46 +0000 (0:00:00.195) 0:00:16.203 **** 2025-09-27 00:39:52.648182 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:39:52.648192 | orchestrator | 2025-09-27 00:39:52.648231 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-09-27 00:39:52.648245 | orchestrator | Saturday 27 September 2025 00:39:46 +0000 (0:00:00.560) 0:00:16.764 **** 2025-09-27 00:39:52.648257 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:39:52.648269 | orchestrator | 2025-09-27 00:39:52.648281 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-09-27 00:39:52.648294 | orchestrator | Saturday 27 September 2025 00:39:47 +0000 (0:00:00.182) 0:00:16.946 **** 2025-09-27 00:39:52.648306 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:39:52.648318 | orchestrator | 2025-09-27 00:39:52.648330 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-09-27 00:39:52.648342 | orchestrator | Saturday 27 September 2025 00:39:47 +0000 (0:00:00.192) 0:00:17.138 **** 2025-09-27 00:39:52.648355 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:39:52.648366 | orchestrator | 2025-09-27 00:39:52.648378 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-09-27 00:39:52.648390 | orchestrator | Saturday 27 September 2025 00:39:47 +0000 (0:00:00.187) 0:00:17.325 **** 2025-09-27 00:39:52.648402 | orchestrator | ok: [testbed-node-4] => (item=scsi-0QEMU_QEMU_HARDDISK_ccfdba47-3be1-47ca-9d9b-c14dc21688e2) 2025-09-27 00:39:52.648415 | orchestrator | ok: [testbed-node-4] => (item=scsi-SQEMU_QEMU_HARDDISK_ccfdba47-3be1-47ca-9d9b-c14dc21688e2) 2025-09-27 00:39:52.648427 | orchestrator | 2025-09-27 
00:39:52.648439 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-09-27 00:39:52.648452 | orchestrator | Saturday 27 September 2025 00:39:47 +0000 (0:00:00.401) 0:00:17.727 **** 2025-09-27 00:39:52.648464 | orchestrator | ok: [testbed-node-4] => (item=scsi-0QEMU_QEMU_HARDDISK_e258aa1c-ff59-4b5b-956f-d2cfc00f460b) 2025-09-27 00:39:52.648477 | orchestrator | ok: [testbed-node-4] => (item=scsi-SQEMU_QEMU_HARDDISK_e258aa1c-ff59-4b5b-956f-d2cfc00f460b) 2025-09-27 00:39:52.648489 | orchestrator | 2025-09-27 00:39:52.648501 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-09-27 00:39:52.648513 | orchestrator | Saturday 27 September 2025 00:39:48 +0000 (0:00:00.391) 0:00:18.119 **** 2025-09-27 00:39:52.648526 | orchestrator | ok: [testbed-node-4] => (item=scsi-0QEMU_QEMU_HARDDISK_f6166654-1631-4845-81e5-73fa20742766) 2025-09-27 00:39:52.648538 | orchestrator | ok: [testbed-node-4] => (item=scsi-SQEMU_QEMU_HARDDISK_f6166654-1631-4845-81e5-73fa20742766) 2025-09-27 00:39:52.648551 | orchestrator | 2025-09-27 00:39:52.648563 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-09-27 00:39:52.648574 | orchestrator | Saturday 27 September 2025 00:39:48 +0000 (0:00:00.411) 0:00:18.530 **** 2025-09-27 00:39:52.648602 | orchestrator | ok: [testbed-node-4] => (item=scsi-0QEMU_QEMU_HARDDISK_88b94aa1-4c02-44af-bedb-78cbed569408) 2025-09-27 00:39:52.648614 | orchestrator | ok: [testbed-node-4] => (item=scsi-SQEMU_QEMU_HARDDISK_88b94aa1-4c02-44af-bedb-78cbed569408) 2025-09-27 00:39:52.648626 | orchestrator | 2025-09-27 00:39:52.648636 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-09-27 00:39:52.648647 | orchestrator | Saturday 27 September 2025 00:39:49 +0000 (0:00:00.420) 0:00:18.951 **** 2025-09-27 00:39:52.648658 | orchestrator | ok: [testbed-node-4] => (item=ata-QEMU_DVD-ROM_QM00001) 2025-09-27 00:39:52.648668 | orchestrator | 2025-09-27 00:39:52.648679 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-09-27 00:39:52.648696 | orchestrator | Saturday 27 September 2025 00:39:49 +0000 (0:00:00.314) 0:00:19.265 **** 2025-09-27 00:39:52.648707 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-4 => (item=loop0) 2025-09-27 00:39:52.648725 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-4 => (item=loop1) 2025-09-27 00:39:52.648736 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-4 => (item=loop2) 2025-09-27 00:39:52.648746 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-4 => (item=loop3) 2025-09-27 00:39:52.648757 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-4 => (item=loop4) 2025-09-27 00:39:52.648768 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-4 => (item=loop5) 2025-09-27 00:39:52.648778 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-4 => (item=loop6) 2025-09-27 00:39:52.648789 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-4 => (item=loop7) 2025-09-27 00:39:52.648799 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-4 => (item=sda) 2025-09-27 00:39:52.648810 | orchestrator | included: 
/ansible/tasks/_add-device-partitions.yml for testbed-node-4 => (item=sdb) 2025-09-27 00:39:52.648820 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-4 => (item=sdc) 2025-09-27 00:39:52.648831 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-4 => (item=sdd) 2025-09-27 00:39:52.648841 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-4 => (item=sr0) 2025-09-27 00:39:52.648852 | orchestrator | 2025-09-27 00:39:52.648863 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-09-27 00:39:52.648873 | orchestrator | Saturday 27 September 2025 00:39:49 +0000 (0:00:00.380) 0:00:19.646 **** 2025-09-27 00:39:52.648884 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:39:52.648894 | orchestrator | 2025-09-27 00:39:52.648905 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-09-27 00:39:52.648916 | orchestrator | Saturday 27 September 2025 00:39:49 +0000 (0:00:00.201) 0:00:19.847 **** 2025-09-27 00:39:52.648926 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:39:52.648937 | orchestrator | 2025-09-27 00:39:52.648947 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-09-27 00:39:52.648958 | orchestrator | Saturday 27 September 2025 00:39:50 +0000 (0:00:00.615) 0:00:20.463 **** 2025-09-27 00:39:52.648968 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:39:52.648979 | orchestrator | 2025-09-27 00:39:52.648989 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-09-27 00:39:52.649000 | orchestrator | Saturday 27 September 2025 00:39:50 +0000 (0:00:00.184) 0:00:20.647 **** 2025-09-27 00:39:52.649011 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:39:52.649021 | orchestrator | 2025-09-27 00:39:52.649032 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-09-27 00:39:52.649043 | orchestrator | Saturday 27 September 2025 00:39:50 +0000 (0:00:00.142) 0:00:20.790 **** 2025-09-27 00:39:52.649053 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:39:52.649064 | orchestrator | 2025-09-27 00:39:52.649075 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-09-27 00:39:52.649085 | orchestrator | Saturday 27 September 2025 00:39:51 +0000 (0:00:00.186) 0:00:20.976 **** 2025-09-27 00:39:52.649096 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:39:52.649106 | orchestrator | 2025-09-27 00:39:52.649117 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-09-27 00:39:52.649128 | orchestrator | Saturday 27 September 2025 00:39:51 +0000 (0:00:00.170) 0:00:21.147 **** 2025-09-27 00:39:52.649138 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:39:52.649149 | orchestrator | 2025-09-27 00:39:52.649159 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-09-27 00:39:52.649170 | orchestrator | Saturday 27 September 2025 00:39:51 +0000 (0:00:00.222) 0:00:21.369 **** 2025-09-27 00:39:52.649181 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:39:52.649191 | orchestrator | 2025-09-27 00:39:52.649230 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-09-27 00:39:52.649248 | orchestrator | Saturday 27 September 
2025 00:39:51 +0000 (0:00:00.203) 0:00:21.573 **** 2025-09-27 00:39:52.649259 | orchestrator | ok: [testbed-node-4] => (item=sda1) 2025-09-27 00:39:52.649270 | orchestrator | ok: [testbed-node-4] => (item=sda14) 2025-09-27 00:39:52.649281 | orchestrator | ok: [testbed-node-4] => (item=sda15) 2025-09-27 00:39:52.649292 | orchestrator | ok: [testbed-node-4] => (item=sda16) 2025-09-27 00:39:52.649303 | orchestrator | 2025-09-27 00:39:52.649313 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-09-27 00:39:52.649324 | orchestrator | Saturday 27 September 2025 00:39:52 +0000 (0:00:00.791) 0:00:22.365 **** 2025-09-27 00:39:52.649335 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:39:52.649346 | orchestrator | 2025-09-27 00:39:52.649363 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-09-27 00:39:58.209692 | orchestrator | Saturday 27 September 2025 00:39:52 +0000 (0:00:00.212) 0:00:22.577 **** 2025-09-27 00:39:58.209800 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:39:58.209817 | orchestrator | 2025-09-27 00:39:58.209830 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-09-27 00:39:58.209842 | orchestrator | Saturday 27 September 2025 00:39:52 +0000 (0:00:00.169) 0:00:22.746 **** 2025-09-27 00:39:58.209853 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:39:58.209864 | orchestrator | 2025-09-27 00:39:58.209876 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-09-27 00:39:58.209887 | orchestrator | Saturday 27 September 2025 00:39:52 +0000 (0:00:00.159) 0:00:22.906 **** 2025-09-27 00:39:58.209898 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:39:58.209910 | orchestrator | 2025-09-27 00:39:58.209940 | orchestrator | TASK [Set UUIDs for OSD VGs/LVs] *********************************************** 2025-09-27 00:39:58.209952 | orchestrator | Saturday 27 September 2025 00:39:53 +0000 (0:00:00.176) 0:00:23.083 **** 2025-09-27 00:39:58.209963 | orchestrator | ok: [testbed-node-4] => (item={'key': 'sdb', 'value': None}) 2025-09-27 00:39:58.209974 | orchestrator | ok: [testbed-node-4] => (item={'key': 'sdc', 'value': None}) 2025-09-27 00:39:58.209985 | orchestrator | 2025-09-27 00:39:58.209996 | orchestrator | TASK [Generate WAL VG names] *************************************************** 2025-09-27 00:39:58.210007 | orchestrator | Saturday 27 September 2025 00:39:53 +0000 (0:00:00.305) 0:00:23.389 **** 2025-09-27 00:39:58.210081 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:39:58.210093 | orchestrator | 2025-09-27 00:39:58.210105 | orchestrator | TASK [Generate DB VG names] **************************************************** 2025-09-27 00:39:58.210116 | orchestrator | Saturday 27 September 2025 00:39:53 +0000 (0:00:00.116) 0:00:23.505 **** 2025-09-27 00:39:58.210127 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:39:58.210138 | orchestrator | 2025-09-27 00:39:58.210149 | orchestrator | TASK [Generate shared DB/WAL VG names] ***************************************** 2025-09-27 00:39:58.210160 | orchestrator | Saturday 27 September 2025 00:39:53 +0000 (0:00:00.112) 0:00:23.617 **** 2025-09-27 00:39:58.210171 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:39:58.210182 | orchestrator | 2025-09-27 00:39:58.210193 | orchestrator | TASK [Define lvm_volumes structures] ******************************************* 2025-09-27 
00:39:58.210250 | orchestrator | Saturday 27 September 2025 00:39:53 +0000 (0:00:00.118) 0:00:23.736 **** 2025-09-27 00:39:58.210265 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:39:58.210278 | orchestrator | 2025-09-27 00:39:58.210291 | orchestrator | TASK [Generate lvm_volumes structure (block only)] ***************************** 2025-09-27 00:39:58.210303 | orchestrator | Saturday 27 September 2025 00:39:53 +0000 (0:00:00.122) 0:00:23.858 **** 2025-09-27 00:39:58.210316 | orchestrator | ok: [testbed-node-4] => (item={'key': 'sdb', 'value': {'osd_lvm_uuid': 'e62f59a6-4044-5e93-b85c-9f8cca280e9f'}}) 2025-09-27 00:39:58.210329 | orchestrator | ok: [testbed-node-4] => (item={'key': 'sdc', 'value': {'osd_lvm_uuid': '634a63d2-bd22-5328-9676-28392545ed43'}}) 2025-09-27 00:39:58.210342 | orchestrator | 2025-09-27 00:39:58.210354 | orchestrator | TASK [Generate lvm_volumes structure (block + db)] ***************************** 2025-09-27 00:39:58.210387 | orchestrator | Saturday 27 September 2025 00:39:54 +0000 (0:00:00.153) 0:00:24.012 **** 2025-09-27 00:39:58.210401 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'sdb', 'value': {'osd_lvm_uuid': 'e62f59a6-4044-5e93-b85c-9f8cca280e9f'}})  2025-09-27 00:39:58.210415 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'sdc', 'value': {'osd_lvm_uuid': '634a63d2-bd22-5328-9676-28392545ed43'}})  2025-09-27 00:39:58.210428 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:39:58.210440 | orchestrator | 2025-09-27 00:39:58.210453 | orchestrator | TASK [Generate lvm_volumes structure (block + wal)] **************************** 2025-09-27 00:39:58.210465 | orchestrator | Saturday 27 September 2025 00:39:54 +0000 (0:00:00.135) 0:00:24.148 **** 2025-09-27 00:39:58.210477 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'sdb', 'value': {'osd_lvm_uuid': 'e62f59a6-4044-5e93-b85c-9f8cca280e9f'}})  2025-09-27 00:39:58.210489 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'sdc', 'value': {'osd_lvm_uuid': '634a63d2-bd22-5328-9676-28392545ed43'}})  2025-09-27 00:39:58.210502 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:39:58.210514 | orchestrator | 2025-09-27 00:39:58.210526 | orchestrator | TASK [Generate lvm_volumes structure (block + db + wal)] *********************** 2025-09-27 00:39:58.210538 | orchestrator | Saturday 27 September 2025 00:39:54 +0000 (0:00:00.130) 0:00:24.279 **** 2025-09-27 00:39:58.210551 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'sdb', 'value': {'osd_lvm_uuid': 'e62f59a6-4044-5e93-b85c-9f8cca280e9f'}})  2025-09-27 00:39:58.210569 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'sdc', 'value': {'osd_lvm_uuid': '634a63d2-bd22-5328-9676-28392545ed43'}})  2025-09-27 00:39:58.210588 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:39:58.210605 | orchestrator | 2025-09-27 00:39:58.210624 | orchestrator | TASK [Compile lvm_volumes] ***************************************************** 2025-09-27 00:39:58.210644 | orchestrator | Saturday 27 September 2025 00:39:54 +0000 (0:00:00.132) 0:00:24.411 **** 2025-09-27 00:39:58.210662 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:39:58.210678 | orchestrator | 2025-09-27 00:39:58.210690 | orchestrator | TASK [Set OSD devices config data] ********************************************* 2025-09-27 00:39:58.210701 | orchestrator | Saturday 27 September 2025 00:39:54 +0000 (0:00:00.119) 0:00:24.531 **** 2025-09-27 00:39:58.210711 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:39:58.210722 
| orchestrator | 2025-09-27 00:39:58.210733 | orchestrator | TASK [Set DB devices config data] ********************************************** 2025-09-27 00:39:58.210743 | orchestrator | Saturday 27 September 2025 00:39:54 +0000 (0:00:00.118) 0:00:24.649 **** 2025-09-27 00:39:58.210754 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:39:58.210765 | orchestrator | 2025-09-27 00:39:58.210794 | orchestrator | TASK [Set WAL devices config data] ********************************************* 2025-09-27 00:39:58.210806 | orchestrator | Saturday 27 September 2025 00:39:54 +0000 (0:00:00.114) 0:00:24.764 **** 2025-09-27 00:39:58.210816 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:39:58.210827 | orchestrator | 2025-09-27 00:39:58.210838 | orchestrator | TASK [Set DB+WAL devices config data] ****************************************** 2025-09-27 00:39:58.210849 | orchestrator | Saturday 27 September 2025 00:39:55 +0000 (0:00:00.256) 0:00:25.020 **** 2025-09-27 00:39:58.210860 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:39:58.210870 | orchestrator | 2025-09-27 00:39:58.210881 | orchestrator | TASK [Print ceph_osd_devices] ************************************************** 2025-09-27 00:39:58.210892 | orchestrator | Saturday 27 September 2025 00:39:55 +0000 (0:00:00.116) 0:00:25.137 **** 2025-09-27 00:39:58.210903 | orchestrator | ok: [testbed-node-4] => { 2025-09-27 00:39:58.210914 | orchestrator |  "ceph_osd_devices": { 2025-09-27 00:39:58.210925 | orchestrator |  "sdb": { 2025-09-27 00:39:58.210937 | orchestrator |  "osd_lvm_uuid": "e62f59a6-4044-5e93-b85c-9f8cca280e9f" 2025-09-27 00:39:58.210948 | orchestrator |  }, 2025-09-27 00:39:58.210959 | orchestrator |  "sdc": { 2025-09-27 00:39:58.210979 | orchestrator |  "osd_lvm_uuid": "634a63d2-bd22-5328-9676-28392545ed43" 2025-09-27 00:39:58.210990 | orchestrator |  } 2025-09-27 00:39:58.211000 | orchestrator |  } 2025-09-27 00:39:58.211011 | orchestrator | } 2025-09-27 00:39:58.211022 | orchestrator | 2025-09-27 00:39:58.211033 | orchestrator | TASK [Print WAL devices] ******************************************************* 2025-09-27 00:39:58.211044 | orchestrator | Saturday 27 September 2025 00:39:55 +0000 (0:00:00.143) 0:00:25.281 **** 2025-09-27 00:39:58.211055 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:39:58.211066 | orchestrator | 2025-09-27 00:39:58.211084 | orchestrator | TASK [Print DB devices] ******************************************************** 2025-09-27 00:39:58.211095 | orchestrator | Saturday 27 September 2025 00:39:55 +0000 (0:00:00.117) 0:00:25.398 **** 2025-09-27 00:39:58.211106 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:39:58.211117 | orchestrator | 2025-09-27 00:39:58.211127 | orchestrator | TASK [Print shared DB/WAL devices] ********************************************* 2025-09-27 00:39:58.211138 | orchestrator | Saturday 27 September 2025 00:39:55 +0000 (0:00:00.119) 0:00:25.517 **** 2025-09-27 00:39:58.211149 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:39:58.211159 | orchestrator | 2025-09-27 00:39:58.211170 | orchestrator | TASK [Print configuration data] ************************************************ 2025-09-27 00:39:58.211181 | orchestrator | Saturday 27 September 2025 00:39:55 +0000 (0:00:00.149) 0:00:25.667 **** 2025-09-27 00:39:58.211192 | orchestrator | changed: [testbed-node-4] => { 2025-09-27 00:39:58.211223 | orchestrator |  "_ceph_configure_lvm_config_data": { 2025-09-27 00:39:58.211235 | orchestrator |  "ceph_osd_devices": { 2025-09-27 
00:39:58.211246 | orchestrator |  "sdb": { 2025-09-27 00:39:58.211257 | orchestrator |  "osd_lvm_uuid": "e62f59a6-4044-5e93-b85c-9f8cca280e9f" 2025-09-27 00:39:58.211273 | orchestrator |  }, 2025-09-27 00:39:58.211284 | orchestrator |  "sdc": { 2025-09-27 00:39:58.211295 | orchestrator |  "osd_lvm_uuid": "634a63d2-bd22-5328-9676-28392545ed43" 2025-09-27 00:39:58.211306 | orchestrator |  } 2025-09-27 00:39:58.211317 | orchestrator |  }, 2025-09-27 00:39:58.211328 | orchestrator |  "lvm_volumes": [ 2025-09-27 00:39:58.211339 | orchestrator |  { 2025-09-27 00:39:58.211350 | orchestrator |  "data": "osd-block-e62f59a6-4044-5e93-b85c-9f8cca280e9f", 2025-09-27 00:39:58.211361 | orchestrator |  "data_vg": "ceph-e62f59a6-4044-5e93-b85c-9f8cca280e9f" 2025-09-27 00:39:58.211371 | orchestrator |  }, 2025-09-27 00:39:58.211382 | orchestrator |  { 2025-09-27 00:39:58.211393 | orchestrator |  "data": "osd-block-634a63d2-bd22-5328-9676-28392545ed43", 2025-09-27 00:39:58.211404 | orchestrator |  "data_vg": "ceph-634a63d2-bd22-5328-9676-28392545ed43" 2025-09-27 00:39:58.211415 | orchestrator |  } 2025-09-27 00:39:58.211426 | orchestrator |  ] 2025-09-27 00:39:58.211436 | orchestrator |  } 2025-09-27 00:39:58.211447 | orchestrator | } 2025-09-27 00:39:58.211458 | orchestrator | 2025-09-27 00:39:58.211469 | orchestrator | RUNNING HANDLER [Write configuration file] ************************************* 2025-09-27 00:39:58.211480 | orchestrator | Saturday 27 September 2025 00:39:55 +0000 (0:00:00.223) 0:00:25.891 **** 2025-09-27 00:39:58.211490 | orchestrator | changed: [testbed-node-4 -> testbed-manager(192.168.16.5)] 2025-09-27 00:39:58.211501 | orchestrator | 2025-09-27 00:39:58.211512 | orchestrator | PLAY [Ceph configure LVM] ****************************************************** 2025-09-27 00:39:58.211523 | orchestrator | 2025-09-27 00:39:58.211534 | orchestrator | TASK [Get extra vars for Ceph configuration] *********************************** 2025-09-27 00:39:58.211544 | orchestrator | Saturday 27 September 2025 00:39:56 +0000 (0:00:00.935) 0:00:26.826 **** 2025-09-27 00:39:58.211555 | orchestrator | ok: [testbed-node-5 -> testbed-manager(192.168.16.5)] 2025-09-27 00:39:58.211566 | orchestrator | 2025-09-27 00:39:58.211577 | orchestrator | TASK [Get initial list of available block devices] ***************************** 2025-09-27 00:39:58.211587 | orchestrator | Saturday 27 September 2025 00:39:57 +0000 (0:00:00.466) 0:00:27.293 **** 2025-09-27 00:39:58.211605 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:39:58.211616 | orchestrator | 2025-09-27 00:39:58.211627 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-09-27 00:39:58.211638 | orchestrator | Saturday 27 September 2025 00:39:57 +0000 (0:00:00.476) 0:00:27.769 **** 2025-09-27 00:39:58.211649 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-5 => (item=loop0) 2025-09-27 00:39:58.211660 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-5 => (item=loop1) 2025-09-27 00:39:58.211670 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-5 => (item=loop2) 2025-09-27 00:39:58.211681 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-5 => (item=loop3) 2025-09-27 00:39:58.211692 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-5 => (item=loop4) 2025-09-27 00:39:58.211703 | orchestrator | included: /ansible/tasks/_add-device-links.yml for 
testbed-node-5 => (item=loop5) 2025-09-27 00:39:58.211720 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-5 => (item=loop6) 2025-09-27 00:40:05.550936 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-5 => (item=loop7) 2025-09-27 00:40:05.551033 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-5 => (item=sda) 2025-09-27 00:40:05.551049 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-5 => (item=sdb) 2025-09-27 00:40:05.551061 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-5 => (item=sdc) 2025-09-27 00:40:05.551072 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-5 => (item=sdd) 2025-09-27 00:40:05.551083 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-5 => (item=sr0) 2025-09-27 00:40:05.551094 | orchestrator | 2025-09-27 00:40:05.551107 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-09-27 00:40:05.551118 | orchestrator | Saturday 27 September 2025 00:39:58 +0000 (0:00:00.357) 0:00:28.127 **** 2025-09-27 00:40:05.551129 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:40:05.551141 | orchestrator | 2025-09-27 00:40:05.551152 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-09-27 00:40:05.551163 | orchestrator | Saturday 27 September 2025 00:39:58 +0000 (0:00:00.246) 0:00:28.373 **** 2025-09-27 00:40:05.551173 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:40:05.551185 | orchestrator | 2025-09-27 00:40:05.551196 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-09-27 00:40:05.551238 | orchestrator | Saturday 27 September 2025 00:39:58 +0000 (0:00:00.194) 0:00:28.568 **** 2025-09-27 00:40:05.551249 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:40:05.551260 | orchestrator | 2025-09-27 00:40:05.551271 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-09-27 00:40:05.551282 | orchestrator | Saturday 27 September 2025 00:39:58 +0000 (0:00:00.176) 0:00:28.745 **** 2025-09-27 00:40:05.551292 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:40:05.551303 | orchestrator | 2025-09-27 00:40:05.551314 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-09-27 00:40:05.551325 | orchestrator | Saturday 27 September 2025 00:39:58 +0000 (0:00:00.170) 0:00:28.915 **** 2025-09-27 00:40:05.551336 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:40:05.551347 | orchestrator | 2025-09-27 00:40:05.551358 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-09-27 00:40:05.551368 | orchestrator | Saturday 27 September 2025 00:39:59 +0000 (0:00:00.199) 0:00:29.115 **** 2025-09-27 00:40:05.551379 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:40:05.551390 | orchestrator | 2025-09-27 00:40:05.551401 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-09-27 00:40:05.551412 | orchestrator | Saturday 27 September 2025 00:39:59 +0000 (0:00:00.176) 0:00:29.291 **** 2025-09-27 00:40:05.551423 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:40:05.551454 | orchestrator | 2025-09-27 00:40:05.551466 | orchestrator | TASK [Add known links to the list of available block devices] 
****************** 2025-09-27 00:40:05.551477 | orchestrator | Saturday 27 September 2025 00:39:59 +0000 (0:00:00.172) 0:00:29.464 **** 2025-09-27 00:40:05.551487 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:40:05.551501 | orchestrator | 2025-09-27 00:40:05.551527 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-09-27 00:40:05.551540 | orchestrator | Saturday 27 September 2025 00:39:59 +0000 (0:00:00.153) 0:00:29.618 **** 2025-09-27 00:40:05.551554 | orchestrator | ok: [testbed-node-5] => (item=scsi-0QEMU_QEMU_HARDDISK_140074d2-f452-403c-a39b-9b7f03d301d0) 2025-09-27 00:40:05.551567 | orchestrator | ok: [testbed-node-5] => (item=scsi-SQEMU_QEMU_HARDDISK_140074d2-f452-403c-a39b-9b7f03d301d0) 2025-09-27 00:40:05.551580 | orchestrator | 2025-09-27 00:40:05.551593 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-09-27 00:40:05.551605 | orchestrator | Saturday 27 September 2025 00:40:00 +0000 (0:00:00.527) 0:00:30.146 **** 2025-09-27 00:40:05.551618 | orchestrator | ok: [testbed-node-5] => (item=scsi-0QEMU_QEMU_HARDDISK_44ee43e4-0ad4-479b-91ef-60ee60e7859d) 2025-09-27 00:40:05.551630 | orchestrator | ok: [testbed-node-5] => (item=scsi-SQEMU_QEMU_HARDDISK_44ee43e4-0ad4-479b-91ef-60ee60e7859d) 2025-09-27 00:40:05.551643 | orchestrator | 2025-09-27 00:40:05.551656 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-09-27 00:40:05.551669 | orchestrator | Saturday 27 September 2025 00:40:00 +0000 (0:00:00.736) 0:00:30.882 **** 2025-09-27 00:40:05.551681 | orchestrator | ok: [testbed-node-5] => (item=scsi-0QEMU_QEMU_HARDDISK_3491b7a4-1f4d-422d-b24b-7572a092bd2f) 2025-09-27 00:40:05.551694 | orchestrator | ok: [testbed-node-5] => (item=scsi-SQEMU_QEMU_HARDDISK_3491b7a4-1f4d-422d-b24b-7572a092bd2f) 2025-09-27 00:40:05.551706 | orchestrator | 2025-09-27 00:40:05.551719 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-09-27 00:40:05.551731 | orchestrator | Saturday 27 September 2025 00:40:01 +0000 (0:00:00.332) 0:00:31.215 **** 2025-09-27 00:40:05.551743 | orchestrator | ok: [testbed-node-5] => (item=scsi-0QEMU_QEMU_HARDDISK_06352aa6-6cdc-4b09-96e0-787a93e7d706) 2025-09-27 00:40:05.551756 | orchestrator | ok: [testbed-node-5] => (item=scsi-SQEMU_QEMU_HARDDISK_06352aa6-6cdc-4b09-96e0-787a93e7d706) 2025-09-27 00:40:05.551768 | orchestrator | 2025-09-27 00:40:05.551781 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-09-27 00:40:05.551794 | orchestrator | Saturday 27 September 2025 00:40:01 +0000 (0:00:00.409) 0:00:31.624 **** 2025-09-27 00:40:05.551806 | orchestrator | ok: [testbed-node-5] => (item=ata-QEMU_DVD-ROM_QM00001) 2025-09-27 00:40:05.551819 | orchestrator | 2025-09-27 00:40:05.551832 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-09-27 00:40:05.551844 | orchestrator | Saturday 27 September 2025 00:40:02 +0000 (0:00:00.316) 0:00:31.940 **** 2025-09-27 00:40:05.551872 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-5 => (item=loop0) 2025-09-27 00:40:05.551884 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-5 => (item=loop1) 2025-09-27 00:40:05.551895 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-5 => (item=loop2) 2025-09-27 00:40:05.551906 | orchestrator 
| included: /ansible/tasks/_add-device-partitions.yml for testbed-node-5 => (item=loop3) 2025-09-27 00:40:05.551917 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-5 => (item=loop4) 2025-09-27 00:40:05.551927 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-5 => (item=loop5) 2025-09-27 00:40:05.551938 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-5 => (item=loop6) 2025-09-27 00:40:05.551949 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-5 => (item=loop7) 2025-09-27 00:40:05.551960 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-5 => (item=sda) 2025-09-27 00:40:05.551978 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-5 => (item=sdb) 2025-09-27 00:40:05.551989 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-5 => (item=sdc) 2025-09-27 00:40:05.552000 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-5 => (item=sdd) 2025-09-27 00:40:05.552011 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-5 => (item=sr0) 2025-09-27 00:40:05.552021 | orchestrator | 2025-09-27 00:40:05.552032 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-09-27 00:40:05.552043 | orchestrator | Saturday 27 September 2025 00:40:02 +0000 (0:00:00.386) 0:00:32.326 **** 2025-09-27 00:40:05.552053 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:40:05.552064 | orchestrator | 2025-09-27 00:40:05.552075 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-09-27 00:40:05.552086 | orchestrator | Saturday 27 September 2025 00:40:02 +0000 (0:00:00.212) 0:00:32.539 **** 2025-09-27 00:40:05.552096 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:40:05.552107 | orchestrator | 2025-09-27 00:40:05.552117 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-09-27 00:40:05.552128 | orchestrator | Saturday 27 September 2025 00:40:02 +0000 (0:00:00.198) 0:00:32.737 **** 2025-09-27 00:40:05.552139 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:40:05.552149 | orchestrator | 2025-09-27 00:40:05.552160 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-09-27 00:40:05.552171 | orchestrator | Saturday 27 September 2025 00:40:03 +0000 (0:00:00.199) 0:00:32.937 **** 2025-09-27 00:40:05.552181 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:40:05.552192 | orchestrator | 2025-09-27 00:40:05.552221 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-09-27 00:40:05.552232 | orchestrator | Saturday 27 September 2025 00:40:03 +0000 (0:00:00.188) 0:00:33.125 **** 2025-09-27 00:40:05.552243 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:40:05.552254 | orchestrator | 2025-09-27 00:40:05.552264 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-09-27 00:40:05.552275 | orchestrator | Saturday 27 September 2025 00:40:03 +0000 (0:00:00.174) 0:00:33.299 **** 2025-09-27 00:40:05.552286 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:40:05.552297 | orchestrator | 2025-09-27 00:40:05.552308 | orchestrator | TASK [Add known partitions to the list of available block devices] 
************* 2025-09-27 00:40:05.552319 | orchestrator | Saturday 27 September 2025 00:40:03 +0000 (0:00:00.508) 0:00:33.808 **** 2025-09-27 00:40:05.552329 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:40:05.552340 | orchestrator | 2025-09-27 00:40:05.552351 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-09-27 00:40:05.552361 | orchestrator | Saturday 27 September 2025 00:40:04 +0000 (0:00:00.179) 0:00:33.988 **** 2025-09-27 00:40:05.552372 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:40:05.552383 | orchestrator | 2025-09-27 00:40:05.552393 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-09-27 00:40:05.552404 | orchestrator | Saturday 27 September 2025 00:40:04 +0000 (0:00:00.168) 0:00:34.156 **** 2025-09-27 00:40:05.552415 | orchestrator | ok: [testbed-node-5] => (item=sda1) 2025-09-27 00:40:05.552425 | orchestrator | ok: [testbed-node-5] => (item=sda14) 2025-09-27 00:40:05.552436 | orchestrator | ok: [testbed-node-5] => (item=sda15) 2025-09-27 00:40:05.552447 | orchestrator | ok: [testbed-node-5] => (item=sda16) 2025-09-27 00:40:05.552458 | orchestrator | 2025-09-27 00:40:05.552468 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-09-27 00:40:05.552479 | orchestrator | Saturday 27 September 2025 00:40:04 +0000 (0:00:00.612) 0:00:34.769 **** 2025-09-27 00:40:05.552490 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:40:05.552501 | orchestrator | 2025-09-27 00:40:05.552512 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-09-27 00:40:05.552529 | orchestrator | Saturday 27 September 2025 00:40:05 +0000 (0:00:00.199) 0:00:34.968 **** 2025-09-27 00:40:05.552539 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:40:05.552550 | orchestrator | 2025-09-27 00:40:05.552561 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-09-27 00:40:05.552572 | orchestrator | Saturday 27 September 2025 00:40:05 +0000 (0:00:00.182) 0:00:35.150 **** 2025-09-27 00:40:05.552582 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:40:05.552593 | orchestrator | 2025-09-27 00:40:05.552604 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-09-27 00:40:05.552615 | orchestrator | Saturday 27 September 2025 00:40:05 +0000 (0:00:00.184) 0:00:35.335 **** 2025-09-27 00:40:05.552631 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:40:05.552642 | orchestrator | 2025-09-27 00:40:05.552653 | orchestrator | TASK [Set UUIDs for OSD VGs/LVs] *********************************************** 2025-09-27 00:40:05.552670 | orchestrator | Saturday 27 September 2025 00:40:05 +0000 (0:00:00.144) 0:00:35.479 **** 2025-09-27 00:40:09.409782 | orchestrator | ok: [testbed-node-5] => (item={'key': 'sdb', 'value': None}) 2025-09-27 00:40:09.409871 | orchestrator | ok: [testbed-node-5] => (item={'key': 'sdc', 'value': None}) 2025-09-27 00:40:09.409887 | orchestrator | 2025-09-27 00:40:09.409900 | orchestrator | TASK [Generate WAL VG names] *************************************************** 2025-09-27 00:40:09.409911 | orchestrator | Saturday 27 September 2025 00:40:05 +0000 (0:00:00.143) 0:00:35.623 **** 2025-09-27 00:40:09.409922 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:40:09.409933 | orchestrator | 2025-09-27 00:40:09.409944 | orchestrator | TASK [Generate DB 
VG names] **************************************************** 2025-09-27 00:40:09.409955 | orchestrator | Saturday 27 September 2025 00:40:05 +0000 (0:00:00.121) 0:00:35.744 **** 2025-09-27 00:40:09.409965 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:40:09.409976 | orchestrator | 2025-09-27 00:40:09.409987 | orchestrator | TASK [Generate shared DB/WAL VG names] ***************************************** 2025-09-27 00:40:09.409997 | orchestrator | Saturday 27 September 2025 00:40:05 +0000 (0:00:00.156) 0:00:35.901 **** 2025-09-27 00:40:09.410008 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:40:09.410068 | orchestrator | 2025-09-27 00:40:09.410079 | orchestrator | TASK [Define lvm_volumes structures] ******************************************* 2025-09-27 00:40:09.410090 | orchestrator | Saturday 27 September 2025 00:40:06 +0000 (0:00:00.126) 0:00:36.028 **** 2025-09-27 00:40:09.410101 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:40:09.410112 | orchestrator | 2025-09-27 00:40:09.410123 | orchestrator | TASK [Generate lvm_volumes structure (block only)] ***************************** 2025-09-27 00:40:09.410133 | orchestrator | Saturday 27 September 2025 00:40:06 +0000 (0:00:00.249) 0:00:36.278 **** 2025-09-27 00:40:09.410145 | orchestrator | ok: [testbed-node-5] => (item={'key': 'sdb', 'value': {'osd_lvm_uuid': '03e94b17-8e91-5aba-9ae0-0b9f0a63cf06'}}) 2025-09-27 00:40:09.410157 | orchestrator | ok: [testbed-node-5] => (item={'key': 'sdc', 'value': {'osd_lvm_uuid': '26537eb5-d37a-51fe-a7ad-0ae3582304de'}}) 2025-09-27 00:40:09.410167 | orchestrator | 2025-09-27 00:40:09.410178 | orchestrator | TASK [Generate lvm_volumes structure (block + db)] ***************************** 2025-09-27 00:40:09.410189 | orchestrator | Saturday 27 September 2025 00:40:06 +0000 (0:00:00.148) 0:00:36.426 **** 2025-09-27 00:40:09.410200 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'sdb', 'value': {'osd_lvm_uuid': '03e94b17-8e91-5aba-9ae0-0b9f0a63cf06'}})  2025-09-27 00:40:09.410258 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'sdc', 'value': {'osd_lvm_uuid': '26537eb5-d37a-51fe-a7ad-0ae3582304de'}})  2025-09-27 00:40:09.410270 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:40:09.410281 | orchestrator | 2025-09-27 00:40:09.410306 | orchestrator | TASK [Generate lvm_volumes structure (block + wal)] **************************** 2025-09-27 00:40:09.410318 | orchestrator | Saturday 27 September 2025 00:40:06 +0000 (0:00:00.130) 0:00:36.556 **** 2025-09-27 00:40:09.410329 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'sdb', 'value': {'osd_lvm_uuid': '03e94b17-8e91-5aba-9ae0-0b9f0a63cf06'}})  2025-09-27 00:40:09.410364 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'sdc', 'value': {'osd_lvm_uuid': '26537eb5-d37a-51fe-a7ad-0ae3582304de'}})  2025-09-27 00:40:09.410377 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:40:09.410389 | orchestrator | 2025-09-27 00:40:09.410401 | orchestrator | TASK [Generate lvm_volumes structure (block + db + wal)] *********************** 2025-09-27 00:40:09.410414 | orchestrator | Saturday 27 September 2025 00:40:06 +0000 (0:00:00.131) 0:00:36.688 **** 2025-09-27 00:40:09.410426 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'sdb', 'value': {'osd_lvm_uuid': '03e94b17-8e91-5aba-9ae0-0b9f0a63cf06'}})  2025-09-27 00:40:09.410439 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'sdc', 'value': {'osd_lvm_uuid': '26537eb5-d37a-51fe-a7ad-0ae3582304de'}})  2025-09-27 
00:40:09.410451 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:40:09.410463 | orchestrator | 2025-09-27 00:40:09.410475 | orchestrator | TASK [Compile lvm_volumes] ***************************************************** 2025-09-27 00:40:09.410489 | orchestrator | Saturday 27 September 2025 00:40:06 +0000 (0:00:00.130) 0:00:36.818 **** 2025-09-27 00:40:09.410501 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:40:09.410513 | orchestrator | 2025-09-27 00:40:09.410525 | orchestrator | TASK [Set OSD devices config data] ********************************************* 2025-09-27 00:40:09.410537 | orchestrator | Saturday 27 September 2025 00:40:07 +0000 (0:00:00.117) 0:00:36.936 **** 2025-09-27 00:40:09.410550 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:40:09.410562 | orchestrator | 2025-09-27 00:40:09.410574 | orchestrator | TASK [Set DB devices config data] ********************************************** 2025-09-27 00:40:09.410586 | orchestrator | Saturday 27 September 2025 00:40:07 +0000 (0:00:00.123) 0:00:37.059 **** 2025-09-27 00:40:09.410598 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:40:09.410610 | orchestrator | 2025-09-27 00:40:09.410622 | orchestrator | TASK [Set WAL devices config data] ********************************************* 2025-09-27 00:40:09.410634 | orchestrator | Saturday 27 September 2025 00:40:07 +0000 (0:00:00.122) 0:00:37.182 **** 2025-09-27 00:40:09.410646 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:40:09.410659 | orchestrator | 2025-09-27 00:40:09.410671 | orchestrator | TASK [Set DB+WAL devices config data] ****************************************** 2025-09-27 00:40:09.410684 | orchestrator | Saturday 27 September 2025 00:40:07 +0000 (0:00:00.130) 0:00:37.312 **** 2025-09-27 00:40:09.410696 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:40:09.410706 | orchestrator | 2025-09-27 00:40:09.410717 | orchestrator | TASK [Print ceph_osd_devices] ************************************************** 2025-09-27 00:40:09.410728 | orchestrator | Saturday 27 September 2025 00:40:07 +0000 (0:00:00.136) 0:00:37.449 **** 2025-09-27 00:40:09.410739 | orchestrator | ok: [testbed-node-5] => { 2025-09-27 00:40:09.410750 | orchestrator |  "ceph_osd_devices": { 2025-09-27 00:40:09.410761 | orchestrator |  "sdb": { 2025-09-27 00:40:09.410773 | orchestrator |  "osd_lvm_uuid": "03e94b17-8e91-5aba-9ae0-0b9f0a63cf06" 2025-09-27 00:40:09.410799 | orchestrator |  }, 2025-09-27 00:40:09.410810 | orchestrator |  "sdc": { 2025-09-27 00:40:09.410822 | orchestrator |  "osd_lvm_uuid": "26537eb5-d37a-51fe-a7ad-0ae3582304de" 2025-09-27 00:40:09.410832 | orchestrator |  } 2025-09-27 00:40:09.410843 | orchestrator |  } 2025-09-27 00:40:09.410855 | orchestrator | } 2025-09-27 00:40:09.410866 | orchestrator | 2025-09-27 00:40:09.410877 | orchestrator | TASK [Print WAL devices] ******************************************************* 2025-09-27 00:40:09.410888 | orchestrator | Saturday 27 September 2025 00:40:07 +0000 (0:00:00.133) 0:00:37.582 **** 2025-09-27 00:40:09.410898 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:40:09.410909 | orchestrator | 2025-09-27 00:40:09.410920 | orchestrator | TASK [Print DB devices] ******************************************************** 2025-09-27 00:40:09.410931 | orchestrator | Saturday 27 September 2025 00:40:07 +0000 (0:00:00.124) 0:00:37.707 **** 2025-09-27 00:40:09.410941 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:40:09.410952 | orchestrator | 2025-09-27 00:40:09.410963 | orchestrator | 
TASK [Print shared DB/WAL devices] ********************************************* 2025-09-27 00:40:09.410982 | orchestrator | Saturday 27 September 2025 00:40:08 +0000 (0:00:00.252) 0:00:37.960 **** 2025-09-27 00:40:09.410992 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:40:09.411003 | orchestrator | 2025-09-27 00:40:09.411014 | orchestrator | TASK [Print configuration data] ************************************************ 2025-09-27 00:40:09.411025 | orchestrator | Saturday 27 September 2025 00:40:08 +0000 (0:00:00.155) 0:00:38.116 **** 2025-09-27 00:40:09.411035 | orchestrator | changed: [testbed-node-5] => { 2025-09-27 00:40:09.411046 | orchestrator |  "_ceph_configure_lvm_config_data": { 2025-09-27 00:40:09.411057 | orchestrator |  "ceph_osd_devices": { 2025-09-27 00:40:09.411068 | orchestrator |  "sdb": { 2025-09-27 00:40:09.411079 | orchestrator |  "osd_lvm_uuid": "03e94b17-8e91-5aba-9ae0-0b9f0a63cf06" 2025-09-27 00:40:09.411090 | orchestrator |  }, 2025-09-27 00:40:09.411101 | orchestrator |  "sdc": { 2025-09-27 00:40:09.411112 | orchestrator |  "osd_lvm_uuid": "26537eb5-d37a-51fe-a7ad-0ae3582304de" 2025-09-27 00:40:09.411123 | orchestrator |  } 2025-09-27 00:40:09.411134 | orchestrator |  }, 2025-09-27 00:40:09.411145 | orchestrator |  "lvm_volumes": [ 2025-09-27 00:40:09.411156 | orchestrator |  { 2025-09-27 00:40:09.411167 | orchestrator |  "data": "osd-block-03e94b17-8e91-5aba-9ae0-0b9f0a63cf06", 2025-09-27 00:40:09.411177 | orchestrator |  "data_vg": "ceph-03e94b17-8e91-5aba-9ae0-0b9f0a63cf06" 2025-09-27 00:40:09.411188 | orchestrator |  }, 2025-09-27 00:40:09.411199 | orchestrator |  { 2025-09-27 00:40:09.411229 | orchestrator |  "data": "osd-block-26537eb5-d37a-51fe-a7ad-0ae3582304de", 2025-09-27 00:40:09.411240 | orchestrator |  "data_vg": "ceph-26537eb5-d37a-51fe-a7ad-0ae3582304de" 2025-09-27 00:40:09.411251 | orchestrator |  } 2025-09-27 00:40:09.411262 | orchestrator |  ] 2025-09-27 00:40:09.411272 | orchestrator |  } 2025-09-27 00:40:09.411287 | orchestrator | } 2025-09-27 00:40:09.411299 | orchestrator | 2025-09-27 00:40:09.411310 | orchestrator | RUNNING HANDLER [Write configuration file] ************************************* 2025-09-27 00:40:09.411320 | orchestrator | Saturday 27 September 2025 00:40:08 +0000 (0:00:00.235) 0:00:38.351 **** 2025-09-27 00:40:09.411331 | orchestrator | changed: [testbed-node-5 -> testbed-manager(192.168.16.5)] 2025-09-27 00:40:09.411342 | orchestrator | 2025-09-27 00:40:09.411353 | orchestrator | PLAY RECAP ********************************************************************* 2025-09-27 00:40:09.411371 | orchestrator | testbed-node-3 : ok=42  changed=2  unreachable=0 failed=0 skipped=32  rescued=0 ignored=0 2025-09-27 00:40:09.411383 | orchestrator | testbed-node-4 : ok=42  changed=2  unreachable=0 failed=0 skipped=32  rescued=0 ignored=0 2025-09-27 00:40:09.411394 | orchestrator | testbed-node-5 : ok=42  changed=2  unreachable=0 failed=0 skipped=32  rescued=0 ignored=0 2025-09-27 00:40:09.411405 | orchestrator | 2025-09-27 00:40:09.411416 | orchestrator | 2025-09-27 00:40:09.411427 | orchestrator | 2025-09-27 00:40:09.411438 | orchestrator | TASKS RECAP ******************************************************************** 2025-09-27 00:40:09.411449 | orchestrator | Saturday 27 September 2025 00:40:09 +0000 (0:00:00.970) 0:00:39.322 **** 2025-09-27 00:40:09.411459 | orchestrator | =============================================================================== 2025-09-27 00:40:09.411470 | orchestrator | Write configuration file 
------------------------------------------------ 3.68s 2025-09-27 00:40:09.411481 | orchestrator | Add known partitions to the list of available block devices ------------- 1.16s 2025-09-27 00:40:09.411491 | orchestrator | Add known links to the list of available block devices ------------------ 1.08s 2025-09-27 00:40:09.411502 | orchestrator | Add known partitions to the list of available block devices ------------- 0.99s 2025-09-27 00:40:09.411513 | orchestrator | Get extra vars for Ceph configuration ----------------------------------- 0.96s 2025-09-27 00:40:09.411530 | orchestrator | Get initial list of available block devices ----------------------------- 0.94s 2025-09-27 00:40:09.411541 | orchestrator | Print configuration data ------------------------------------------------ 0.84s 2025-09-27 00:40:09.411552 | orchestrator | Add known partitions to the list of available block devices ------------- 0.79s 2025-09-27 00:40:09.411563 | orchestrator | Add known links to the list of available block devices ------------------ 0.77s 2025-09-27 00:40:09.411574 | orchestrator | Add known links to the list of available block devices ------------------ 0.74s 2025-09-27 00:40:09.411585 | orchestrator | Generate lvm_volumes structure (block + wal) ---------------------------- 0.64s 2025-09-27 00:40:09.411595 | orchestrator | Set UUIDs for OSD VGs/LVs ----------------------------------------------- 0.62s 2025-09-27 00:40:09.411606 | orchestrator | Add known partitions to the list of available block devices ------------- 0.62s 2025-09-27 00:40:09.411617 | orchestrator | Add known partitions to the list of available block devices ------------- 0.61s 2025-09-27 00:40:09.411634 | orchestrator | Add known links to the list of available block devices ------------------ 0.61s 2025-09-27 00:40:09.623787 | orchestrator | Add known links to the list of available block devices ------------------ 0.60s 2025-09-27 00:40:09.623864 | orchestrator | Add known links to the list of available block devices ------------------ 0.56s 2025-09-27 00:40:09.623878 | orchestrator | Add known links to the list of available block devices ------------------ 0.53s 2025-09-27 00:40:09.623890 | orchestrator | Set WAL devices config data --------------------------------------------- 0.53s 2025-09-27 00:40:09.623901 | orchestrator | Print DB devices -------------------------------------------------------- 0.51s 2025-09-27 00:40:32.287608 | orchestrator | 2025-09-27 00:40:32 | INFO  | Task 188bccbc-3972-463a-a630-bbfc83f8a786 (sync inventory) is running in background. Output coming soon. 
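For reference, a minimal Python sketch (not the OSISM implementation) of the mapping visible in the "Print configuration data" output above: every entry in ceph_osd_devices carries an osd_lvm_uuid, and the matching lvm_volumes item reuses that UUID as osd-block-<uuid> / ceph-<uuid>. Device names and UUIDs are copied from the log for testbed-node-5; how the UUIDs themselves are generated is not shown in this output and is left out.

# Illustrative only: rebuild the lvm_volumes list from ceph_osd_devices,
# using the naming scheme shown in the play output above.
ceph_osd_devices = {
    "sdb": {"osd_lvm_uuid": "03e94b17-8e91-5aba-9ae0-0b9f0a63cf06"},
    "sdc": {"osd_lvm_uuid": "26537eb5-d37a-51fe-a7ad-0ae3582304de"},
}

def build_lvm_volumes(devices):
    """Return one lvm_volumes entry per OSD device (block-only layout)."""
    volumes = []
    for _device, meta in sorted(devices.items()):
        osd_uuid = meta["osd_lvm_uuid"]
        volumes.append({"data": "osd-block-" + osd_uuid,
                        "data_vg": "ceph-" + osd_uuid})
    return volumes

print(build_lvm_volumes(ceph_osd_devices))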
2025-09-27 00:40:55.459500 | orchestrator | 2025-09-27 00:40:33 | INFO  | Starting group_vars file reorganization 2025-09-27 00:40:55.459616 | orchestrator | 2025-09-27 00:40:33 | INFO  | Moved 0 file(s) to their respective directories 2025-09-27 00:40:55.459633 | orchestrator | 2025-09-27 00:40:33 | INFO  | Group_vars file reorganization completed 2025-09-27 00:40:55.459646 | orchestrator | 2025-09-27 00:40:36 | INFO  | Starting variable preparation from inventory 2025-09-27 00:40:55.459657 | orchestrator | 2025-09-27 00:40:39 | INFO  | Writing 050-kolla-ceph-rgw-hosts.yml with ceph_rgw_hosts 2025-09-27 00:40:55.459668 | orchestrator | 2025-09-27 00:40:39 | INFO  | Writing 050-infrastructure-cephclient-mons.yml with cephclient_mons 2025-09-27 00:40:55.459679 | orchestrator | 2025-09-27 00:40:39 | INFO  | Writing 050-ceph-cluster-fsid.yml with ceph_cluster_fsid 2025-09-27 00:40:55.459690 | orchestrator | 2025-09-27 00:40:39 | INFO  | 3 file(s) written, 6 host(s) processed 2025-09-27 00:40:55.459701 | orchestrator | 2025-09-27 00:40:39 | INFO  | Variable preparation completed 2025-09-27 00:40:55.459712 | orchestrator | 2025-09-27 00:40:40 | INFO  | Starting inventory overwrite handling 2025-09-27 00:40:55.459722 | orchestrator | 2025-09-27 00:40:40 | INFO  | Handling group overwrites in 99-overwrite 2025-09-27 00:40:55.459734 | orchestrator | 2025-09-27 00:40:40 | INFO  | Removing group frr:children from 60-generic 2025-09-27 00:40:55.459745 | orchestrator | 2025-09-27 00:40:40 | INFO  | Removing group storage:children from 50-kolla 2025-09-27 00:40:55.459755 | orchestrator | 2025-09-27 00:40:40 | INFO  | Removing group netbird:children from 50-infrastructure 2025-09-27 00:40:55.459766 | orchestrator | 2025-09-27 00:40:40 | INFO  | Removing group ceph-mds from 50-ceph 2025-09-27 00:40:55.459778 | orchestrator | 2025-09-27 00:40:40 | INFO  | Removing group ceph-rgw from 50-ceph 2025-09-27 00:40:55.459788 | orchestrator | 2025-09-27 00:40:40 | INFO  | Handling group overwrites in 20-roles 2025-09-27 00:40:55.459799 | orchestrator | 2025-09-27 00:40:40 | INFO  | Removing group k3s_node from 50-infrastructure 2025-09-27 00:40:55.459833 | orchestrator | 2025-09-27 00:40:40 | INFO  | Removed 6 group(s) in total 2025-09-27 00:40:55.459845 | orchestrator | 2025-09-27 00:40:40 | INFO  | Inventory overwrite handling completed 2025-09-27 00:40:55.459855 | orchestrator | 2025-09-27 00:40:41 | INFO  | Starting merge of inventory files 2025-09-27 00:40:55.459885 | orchestrator | 2025-09-27 00:40:41 | INFO  | Inventory files merged successfully 2025-09-27 00:40:55.459896 | orchestrator | 2025-09-27 00:40:44 | INFO  | Generating ClusterShell configuration from Ansible inventory 2025-09-27 00:40:55.459918 | orchestrator | 2025-09-27 00:40:54 | INFO  | Successfully wrote ClusterShell configuration 2025-09-27 00:40:55.459930 | orchestrator | [master c02f525] 2025-09-27-00-40 2025-09-27 00:40:55.459942 | orchestrator | 1 file changed, 30 insertions(+), 9 deletions(-) 2025-09-27 00:40:57.225844 | orchestrator | 2025-09-27 00:40:57 | INFO  | Task b4263ab6-d31b-401e-b574-ed6e1ddaee47 (ceph-create-lvm-devices) was prepared for execution. 2025-09-27 00:40:57.225959 | orchestrator | 2025-09-27 00:40:57 | INFO  | It takes a moment until task b4263ab6-d31b-401e-b574-ed6e1ddaee47 (ceph-create-lvm-devices) has been started and output is visible here. 
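A minimal sketch of the "Writing 050-...yml with ..." steps logged above: each derived variable (ceph_rgw_hosts, cephclient_mons, ceph_cluster_fsid) ends up as a small group_vars YAML file. The file and variable names are taken from the log; the target directory and host list below are placeholders, not the actual sync-inventory code.

# Illustrative helper, assuming PyYAML is available; paths and hosts are made up.
from pathlib import Path

import yaml

def write_derived_vars(group_vars_dir, filename, key, value):
    """Write a single {key: value} mapping as a small group_vars YAML file."""
    base = Path(group_vars_dir)
    base.mkdir(parents=True, exist_ok=True)
    target = base / filename
    target.write_text(yaml.safe_dump({key: value}, default_flow_style=False))
    return target

write_derived_vars(
    "/tmp/group_vars",                            # assumed location for this sketch
    "050-infrastructure-cephclient-mons.yml",     # file name as seen in the log
    "cephclient_mons",                            # variable name as seen in the log
    ["ceph-mon-a", "ceph-mon-b", "ceph-mon-c"],   # placeholder monitor hosts
)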
2025-09-27 00:41:07.474348 | orchestrator | 2025-09-27 00:41:07.474465 | orchestrator | PLAY [Ceph create LVM devices] ************************************************* 2025-09-27 00:41:07.474483 | orchestrator | 2025-09-27 00:41:07.474495 | orchestrator | TASK [Get extra vars for Ceph configuration] *********************************** 2025-09-27 00:41:07.474507 | orchestrator | Saturday 27 September 2025 00:41:01 +0000 (0:00:00.285) 0:00:00.285 **** 2025-09-27 00:41:07.474519 | orchestrator | ok: [testbed-node-3 -> testbed-manager(192.168.16.5)] 2025-09-27 00:41:07.474530 | orchestrator | 2025-09-27 00:41:07.474541 | orchestrator | TASK [Get initial list of available block devices] ***************************** 2025-09-27 00:41:07.474552 | orchestrator | Saturday 27 September 2025 00:41:01 +0000 (0:00:00.213) 0:00:00.499 **** 2025-09-27 00:41:07.474563 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:41:07.474575 | orchestrator | 2025-09-27 00:41:07.474586 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-09-27 00:41:07.474597 | orchestrator | Saturday 27 September 2025 00:41:01 +0000 (0:00:00.176) 0:00:00.675 **** 2025-09-27 00:41:07.474608 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-3 => (item=loop0) 2025-09-27 00:41:07.474621 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-3 => (item=loop1) 2025-09-27 00:41:07.474632 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-3 => (item=loop2) 2025-09-27 00:41:07.474643 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-3 => (item=loop3) 2025-09-27 00:41:07.474654 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-3 => (item=loop4) 2025-09-27 00:41:07.474664 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-3 => (item=loop5) 2025-09-27 00:41:07.474675 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-3 => (item=loop6) 2025-09-27 00:41:07.474686 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-3 => (item=loop7) 2025-09-27 00:41:07.474697 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-3 => (item=sda) 2025-09-27 00:41:07.474708 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-3 => (item=sdb) 2025-09-27 00:41:07.474718 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-3 => (item=sdc) 2025-09-27 00:41:07.474729 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-3 => (item=sdd) 2025-09-27 00:41:07.474740 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-3 => (item=sr0) 2025-09-27 00:41:07.474750 | orchestrator | 2025-09-27 00:41:07.474761 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-09-27 00:41:07.474797 | orchestrator | Saturday 27 September 2025 00:41:01 +0000 (0:00:00.327) 0:00:01.003 **** 2025-09-27 00:41:07.474815 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:41:07.474835 | orchestrator | 2025-09-27 00:41:07.474851 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-09-27 00:41:07.474882 | orchestrator | Saturday 27 September 2025 00:41:02 +0000 (0:00:00.304) 0:00:01.307 **** 2025-09-27 00:41:07.474896 | orchestrator | skipping: [testbed-node-3] 2025-09-27 
00:41:07.474909 | orchestrator | 2025-09-27 00:41:07.474921 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-09-27 00:41:07.474933 | orchestrator | Saturday 27 September 2025 00:41:02 +0000 (0:00:00.163) 0:00:01.471 **** 2025-09-27 00:41:07.474950 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:41:07.474963 | orchestrator | 2025-09-27 00:41:07.474975 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-09-27 00:41:07.474988 | orchestrator | Saturday 27 September 2025 00:41:02 +0000 (0:00:00.173) 0:00:01.645 **** 2025-09-27 00:41:07.475001 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:41:07.475013 | orchestrator | 2025-09-27 00:41:07.475026 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-09-27 00:41:07.475038 | orchestrator | Saturday 27 September 2025 00:41:02 +0000 (0:00:00.175) 0:00:01.820 **** 2025-09-27 00:41:07.475052 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:41:07.475064 | orchestrator | 2025-09-27 00:41:07.475077 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-09-27 00:41:07.475089 | orchestrator | Saturday 27 September 2025 00:41:02 +0000 (0:00:00.161) 0:00:01.982 **** 2025-09-27 00:41:07.475103 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:41:07.475115 | orchestrator | 2025-09-27 00:41:07.475128 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-09-27 00:41:07.475140 | orchestrator | Saturday 27 September 2025 00:41:03 +0000 (0:00:00.170) 0:00:02.153 **** 2025-09-27 00:41:07.475153 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:41:07.475165 | orchestrator | 2025-09-27 00:41:07.475178 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-09-27 00:41:07.475191 | orchestrator | Saturday 27 September 2025 00:41:03 +0000 (0:00:00.176) 0:00:02.330 **** 2025-09-27 00:41:07.475222 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:41:07.475234 | orchestrator | 2025-09-27 00:41:07.475245 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-09-27 00:41:07.475256 | orchestrator | Saturday 27 September 2025 00:41:03 +0000 (0:00:00.145) 0:00:02.475 **** 2025-09-27 00:41:07.475266 | orchestrator | ok: [testbed-node-3] => (item=scsi-0QEMU_QEMU_HARDDISK_9830fc06-72ca-4b97-ae11-006364930d3a) 2025-09-27 00:41:07.475278 | orchestrator | ok: [testbed-node-3] => (item=scsi-SQEMU_QEMU_HARDDISK_9830fc06-72ca-4b97-ae11-006364930d3a) 2025-09-27 00:41:07.475289 | orchestrator | 2025-09-27 00:41:07.475300 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-09-27 00:41:07.475310 | orchestrator | Saturday 27 September 2025 00:41:03 +0000 (0:00:00.304) 0:00:02.780 **** 2025-09-27 00:41:07.475340 | orchestrator | ok: [testbed-node-3] => (item=scsi-0QEMU_QEMU_HARDDISK_db689dff-d74e-43e3-a305-79ec0de29e1e) 2025-09-27 00:41:07.475352 | orchestrator | ok: [testbed-node-3] => (item=scsi-SQEMU_QEMU_HARDDISK_db689dff-d74e-43e3-a305-79ec0de29e1e) 2025-09-27 00:41:07.475363 | orchestrator | 2025-09-27 00:41:07.475373 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-09-27 00:41:07.475384 | orchestrator | Saturday 27 September 2025 00:41:04 +0000 (0:00:00.370) 0:00:03.151 **** 2025-09-27 
00:41:07.475395 | orchestrator | ok: [testbed-node-3] => (item=scsi-0QEMU_QEMU_HARDDISK_aa54db64-5ca4-4f56-bafa-5b00a4002696) 2025-09-27 00:41:07.475405 | orchestrator | ok: [testbed-node-3] => (item=scsi-SQEMU_QEMU_HARDDISK_aa54db64-5ca4-4f56-bafa-5b00a4002696) 2025-09-27 00:41:07.475416 | orchestrator | 2025-09-27 00:41:07.475427 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-09-27 00:41:07.475533 | orchestrator | Saturday 27 September 2025 00:41:04 +0000 (0:00:00.501) 0:00:03.652 **** 2025-09-27 00:41:07.475545 | orchestrator | ok: [testbed-node-3] => (item=scsi-0QEMU_QEMU_HARDDISK_09efdf41-dbe9-4aba-b0d6-c49a377077cc) 2025-09-27 00:41:07.475556 | orchestrator | ok: [testbed-node-3] => (item=scsi-SQEMU_QEMU_HARDDISK_09efdf41-dbe9-4aba-b0d6-c49a377077cc) 2025-09-27 00:41:07.475567 | orchestrator | 2025-09-27 00:41:07.475578 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-09-27 00:41:07.475588 | orchestrator | Saturday 27 September 2025 00:41:05 +0000 (0:00:00.503) 0:00:04.156 **** 2025-09-27 00:41:07.475599 | orchestrator | ok: [testbed-node-3] => (item=ata-QEMU_DVD-ROM_QM00001) 2025-09-27 00:41:07.475610 | orchestrator | 2025-09-27 00:41:07.475621 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-09-27 00:41:07.475631 | orchestrator | Saturday 27 September 2025 00:41:05 +0000 (0:00:00.541) 0:00:04.698 **** 2025-09-27 00:41:07.475642 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-3 => (item=loop0) 2025-09-27 00:41:07.475652 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-3 => (item=loop1) 2025-09-27 00:41:07.475663 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-3 => (item=loop2) 2025-09-27 00:41:07.475673 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-3 => (item=loop3) 2025-09-27 00:41:07.475684 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-3 => (item=loop4) 2025-09-27 00:41:07.475695 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-3 => (item=loop5) 2025-09-27 00:41:07.475705 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-3 => (item=loop6) 2025-09-27 00:41:07.475716 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-3 => (item=loop7) 2025-09-27 00:41:07.475726 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-3 => (item=sda) 2025-09-27 00:41:07.475737 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-3 => (item=sdb) 2025-09-27 00:41:07.475747 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-3 => (item=sdc) 2025-09-27 00:41:07.475757 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-3 => (item=sdd) 2025-09-27 00:41:07.475768 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-3 => (item=sr0) 2025-09-27 00:41:07.475779 | orchestrator | 2025-09-27 00:41:07.475789 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-09-27 00:41:07.475800 | orchestrator | Saturday 27 September 2025 00:41:05 +0000 (0:00:00.368) 0:00:05.067 **** 2025-09-27 00:41:07.475811 | orchestrator | skipping: [testbed-node-3] 
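A minimal sketch of what the repeated "Add known links to the list of available block devices" tasks accomplish: resolving /dev/disk/by-id symlinks (scsi-0QEMU_QEMU_HARDDISK_..., ata-QEMU_DVD-ROM_...) back to kernel device names such as sdb or sdc, so devices can be matched by stable identifiers. The included file /ansible/tasks/_add-device-links.yml is OSISM's; this standalone helper only illustrates the idea and assumes a Linux host with /dev/disk/by-id present.

# Illustrative only: map kernel device names to their stable by-id link names.
from pathlib import Path

def device_links(by_id_dir="/dev/disk/by-id"):
    """Return {kernel device name: [by-id link names]}."""
    links = {}
    for link in Path(by_id_dir).iterdir():
        target = link.resolve().name   # e.g. "sdb" or "sda1"
        links.setdefault(target, []).append(link.name)
    return links

for device, names in sorted(device_links().items()):
    print(device, "->", ", ".join(sorted(names)))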
2025-09-27 00:41:07.475821 | orchestrator | 2025-09-27 00:41:07.475832 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-09-27 00:41:07.475843 | orchestrator | Saturday 27 September 2025 00:41:06 +0000 (0:00:00.186) 0:00:05.254 **** 2025-09-27 00:41:07.475853 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:41:07.475864 | orchestrator | 2025-09-27 00:41:07.475875 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-09-27 00:41:07.475885 | orchestrator | Saturday 27 September 2025 00:41:06 +0000 (0:00:00.179) 0:00:05.433 **** 2025-09-27 00:41:07.475896 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:41:07.475907 | orchestrator | 2025-09-27 00:41:07.475917 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-09-27 00:41:07.475928 | orchestrator | Saturday 27 September 2025 00:41:06 +0000 (0:00:00.166) 0:00:05.600 **** 2025-09-27 00:41:07.475939 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:41:07.475949 | orchestrator | 2025-09-27 00:41:07.475960 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-09-27 00:41:07.475977 | orchestrator | Saturday 27 September 2025 00:41:06 +0000 (0:00:00.192) 0:00:05.793 **** 2025-09-27 00:41:07.475988 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:41:07.475998 | orchestrator | 2025-09-27 00:41:07.476009 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-09-27 00:41:07.476019 | orchestrator | Saturday 27 September 2025 00:41:06 +0000 (0:00:00.184) 0:00:05.977 **** 2025-09-27 00:41:07.476030 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:41:07.476040 | orchestrator | 2025-09-27 00:41:07.476051 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-09-27 00:41:07.476062 | orchestrator | Saturday 27 September 2025 00:41:07 +0000 (0:00:00.188) 0:00:06.166 **** 2025-09-27 00:41:07.476072 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:41:07.476083 | orchestrator | 2025-09-27 00:41:07.476093 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-09-27 00:41:07.476104 | orchestrator | Saturday 27 September 2025 00:41:07 +0000 (0:00:00.185) 0:00:06.351 **** 2025-09-27 00:41:07.476123 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:41:15.447693 | orchestrator | 2025-09-27 00:41:15.447807 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-09-27 00:41:15.447824 | orchestrator | Saturday 27 September 2025 00:41:07 +0000 (0:00:00.199) 0:00:06.551 **** 2025-09-27 00:41:15.447836 | orchestrator | ok: [testbed-node-3] => (item=sda1) 2025-09-27 00:41:15.447849 | orchestrator | ok: [testbed-node-3] => (item=sda14) 2025-09-27 00:41:15.447860 | orchestrator | ok: [testbed-node-3] => (item=sda15) 2025-09-27 00:41:15.447871 | orchestrator | ok: [testbed-node-3] => (item=sda16) 2025-09-27 00:41:15.447882 | orchestrator | 2025-09-27 00:41:15.447893 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-09-27 00:41:15.447904 | orchestrator | Saturday 27 September 2025 00:41:08 +0000 (0:00:01.148) 0:00:07.699 **** 2025-09-27 00:41:15.447916 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:41:15.447927 | orchestrator | 2025-09-27 00:41:15.447939 | orchestrator | 
TASK [Add known partitions to the list of available block devices] ************* 2025-09-27 00:41:15.447950 | orchestrator | Saturday 27 September 2025 00:41:08 +0000 (0:00:00.251) 0:00:07.950 **** 2025-09-27 00:41:15.447960 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:41:15.447971 | orchestrator | 2025-09-27 00:41:15.447982 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-09-27 00:41:15.447993 | orchestrator | Saturday 27 September 2025 00:41:09 +0000 (0:00:00.211) 0:00:08.162 **** 2025-09-27 00:41:15.448003 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:41:15.448014 | orchestrator | 2025-09-27 00:41:15.448025 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-09-27 00:41:15.448036 | orchestrator | Saturday 27 September 2025 00:41:09 +0000 (0:00:00.203) 0:00:08.366 **** 2025-09-27 00:41:15.448046 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:41:15.448057 | orchestrator | 2025-09-27 00:41:15.448068 | orchestrator | TASK [Check whether ceph_db_wal_devices is used exclusively] ******************* 2025-09-27 00:41:15.448078 | orchestrator | Saturday 27 September 2025 00:41:09 +0000 (0:00:00.188) 0:00:08.554 **** 2025-09-27 00:41:15.448089 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:41:15.448099 | orchestrator | 2025-09-27 00:41:15.448110 | orchestrator | TASK [Create dict of block VGs -> PVs from ceph_osd_devices] ******************* 2025-09-27 00:41:15.448121 | orchestrator | Saturday 27 September 2025 00:41:09 +0000 (0:00:00.125) 0:00:08.680 **** 2025-09-27 00:41:15.448132 | orchestrator | ok: [testbed-node-3] => (item={'key': 'sdb', 'value': {'osd_lvm_uuid': '025d8a54-72cd-5dfc-843f-2890244ba468'}}) 2025-09-27 00:41:15.448143 | orchestrator | ok: [testbed-node-3] => (item={'key': 'sdc', 'value': {'osd_lvm_uuid': '9ca7935d-e986-5962-b530-505e6c7ac609'}}) 2025-09-27 00:41:15.448154 | orchestrator | 2025-09-27 00:41:15.448165 | orchestrator | TASK [Create block VGs] ******************************************************** 2025-09-27 00:41:15.448176 | orchestrator | Saturday 27 September 2025 00:41:09 +0000 (0:00:00.199) 0:00:08.880 **** 2025-09-27 00:41:15.448188 | orchestrator | changed: [testbed-node-3] => (item={'data': 'osd-block-025d8a54-72cd-5dfc-843f-2890244ba468', 'data_vg': 'ceph-025d8a54-72cd-5dfc-843f-2890244ba468'}) 2025-09-27 00:41:15.448249 | orchestrator | changed: [testbed-node-3] => (item={'data': 'osd-block-9ca7935d-e986-5962-b530-505e6c7ac609', 'data_vg': 'ceph-9ca7935d-e986-5962-b530-505e6c7ac609'}) 2025-09-27 00:41:15.448264 | orchestrator | 2025-09-27 00:41:15.448294 | orchestrator | TASK [Print 'Create block VGs'] ************************************************ 2025-09-27 00:41:15.448312 | orchestrator | Saturday 27 September 2025 00:41:11 +0000 (0:00:01.974) 0:00:10.855 **** 2025-09-27 00:41:15.448325 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-025d8a54-72cd-5dfc-843f-2890244ba468', 'data_vg': 'ceph-025d8a54-72cd-5dfc-843f-2890244ba468'})  2025-09-27 00:41:15.448340 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-9ca7935d-e986-5962-b530-505e6c7ac609', 'data_vg': 'ceph-9ca7935d-e986-5962-b530-505e6c7ac609'})  2025-09-27 00:41:15.448353 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:41:15.448365 | orchestrator | 2025-09-27 00:41:15.448377 | orchestrator | TASK [Create block LVs] ******************************************************** 2025-09-27 
00:41:15.448390 | orchestrator | Saturday 27 September 2025 00:41:11 +0000 (0:00:00.144) 0:00:10.999 **** 2025-09-27 00:41:15.448402 | orchestrator | changed: [testbed-node-3] => (item={'data': 'osd-block-025d8a54-72cd-5dfc-843f-2890244ba468', 'data_vg': 'ceph-025d8a54-72cd-5dfc-843f-2890244ba468'}) 2025-09-27 00:41:15.448414 | orchestrator | changed: [testbed-node-3] => (item={'data': 'osd-block-9ca7935d-e986-5962-b530-505e6c7ac609', 'data_vg': 'ceph-9ca7935d-e986-5962-b530-505e6c7ac609'}) 2025-09-27 00:41:15.448427 | orchestrator | 2025-09-27 00:41:15.448440 | orchestrator | TASK [Print 'Create block LVs'] ************************************************ 2025-09-27 00:41:15.448452 | orchestrator | Saturday 27 September 2025 00:41:13 +0000 (0:00:01.465) 0:00:12.464 **** 2025-09-27 00:41:15.448465 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-025d8a54-72cd-5dfc-843f-2890244ba468', 'data_vg': 'ceph-025d8a54-72cd-5dfc-843f-2890244ba468'})  2025-09-27 00:41:15.448477 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-9ca7935d-e986-5962-b530-505e6c7ac609', 'data_vg': 'ceph-9ca7935d-e986-5962-b530-505e6c7ac609'})  2025-09-27 00:41:15.448490 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:41:15.448502 | orchestrator | 2025-09-27 00:41:15.448514 | orchestrator | TASK [Create DB VGs] *********************************************************** 2025-09-27 00:41:15.448528 | orchestrator | Saturday 27 September 2025 00:41:13 +0000 (0:00:00.142) 0:00:12.607 **** 2025-09-27 00:41:15.448540 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:41:15.448552 | orchestrator | 2025-09-27 00:41:15.448566 | orchestrator | TASK [Print 'Create DB VGs'] *************************************************** 2025-09-27 00:41:15.448596 | orchestrator | Saturday 27 September 2025 00:41:13 +0000 (0:00:00.134) 0:00:12.742 **** 2025-09-27 00:41:15.448608 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-025d8a54-72cd-5dfc-843f-2890244ba468', 'data_vg': 'ceph-025d8a54-72cd-5dfc-843f-2890244ba468'})  2025-09-27 00:41:15.448619 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-9ca7935d-e986-5962-b530-505e6c7ac609', 'data_vg': 'ceph-9ca7935d-e986-5962-b530-505e6c7ac609'})  2025-09-27 00:41:15.448630 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:41:15.448641 | orchestrator | 2025-09-27 00:41:15.448651 | orchestrator | TASK [Create WAL VGs] ********************************************************** 2025-09-27 00:41:15.448662 | orchestrator | Saturday 27 September 2025 00:41:13 +0000 (0:00:00.281) 0:00:13.024 **** 2025-09-27 00:41:15.448673 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:41:15.448684 | orchestrator | 2025-09-27 00:41:15.448694 | orchestrator | TASK [Print 'Create WAL VGs'] ************************************************** 2025-09-27 00:41:15.448705 | orchestrator | Saturday 27 September 2025 00:41:14 +0000 (0:00:00.171) 0:00:13.195 **** 2025-09-27 00:41:15.448716 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-025d8a54-72cd-5dfc-843f-2890244ba468', 'data_vg': 'ceph-025d8a54-72cd-5dfc-843f-2890244ba468'})  2025-09-27 00:41:15.448735 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-9ca7935d-e986-5962-b530-505e6c7ac609', 'data_vg': 'ceph-9ca7935d-e986-5962-b530-505e6c7ac609'})  2025-09-27 00:41:15.448746 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:41:15.448756 | orchestrator | 2025-09-27 00:41:15.448767 | orchestrator | 
TASK [Create DB+WAL VGs] ******************************************************* 2025-09-27 00:41:15.448778 | orchestrator | Saturday 27 September 2025 00:41:14 +0000 (0:00:00.164) 0:00:13.360 **** 2025-09-27 00:41:15.448788 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:41:15.448799 | orchestrator | 2025-09-27 00:41:15.448810 | orchestrator | TASK [Print 'Create DB+WAL VGs'] *********************************************** 2025-09-27 00:41:15.448820 | orchestrator | Saturday 27 September 2025 00:41:14 +0000 (0:00:00.148) 0:00:13.509 **** 2025-09-27 00:41:15.448831 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-025d8a54-72cd-5dfc-843f-2890244ba468', 'data_vg': 'ceph-025d8a54-72cd-5dfc-843f-2890244ba468'})  2025-09-27 00:41:15.448842 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-9ca7935d-e986-5962-b530-505e6c7ac609', 'data_vg': 'ceph-9ca7935d-e986-5962-b530-505e6c7ac609'})  2025-09-27 00:41:15.448853 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:41:15.448864 | orchestrator | 2025-09-27 00:41:15.448874 | orchestrator | TASK [Prepare variables for OSD count check] *********************************** 2025-09-27 00:41:15.448885 | orchestrator | Saturday 27 September 2025 00:41:14 +0000 (0:00:00.139) 0:00:13.649 **** 2025-09-27 00:41:15.448896 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:41:15.448907 | orchestrator | 2025-09-27 00:41:15.448918 | orchestrator | TASK [Count OSDs put on ceph_db_devices defined in lvm_volumes] **************** 2025-09-27 00:41:15.448928 | orchestrator | Saturday 27 September 2025 00:41:14 +0000 (0:00:00.141) 0:00:13.790 **** 2025-09-27 00:41:15.448944 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-025d8a54-72cd-5dfc-843f-2890244ba468', 'data_vg': 'ceph-025d8a54-72cd-5dfc-843f-2890244ba468'})  2025-09-27 00:41:15.448955 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-9ca7935d-e986-5962-b530-505e6c7ac609', 'data_vg': 'ceph-9ca7935d-e986-5962-b530-505e6c7ac609'})  2025-09-27 00:41:15.448966 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:41:15.448977 | orchestrator | 2025-09-27 00:41:15.448988 | orchestrator | TASK [Count OSDs put on ceph_wal_devices defined in lvm_volumes] *************** 2025-09-27 00:41:15.448999 | orchestrator | Saturday 27 September 2025 00:41:14 +0000 (0:00:00.137) 0:00:13.928 **** 2025-09-27 00:41:15.449009 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-025d8a54-72cd-5dfc-843f-2890244ba468', 'data_vg': 'ceph-025d8a54-72cd-5dfc-843f-2890244ba468'})  2025-09-27 00:41:15.449020 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-9ca7935d-e986-5962-b530-505e6c7ac609', 'data_vg': 'ceph-9ca7935d-e986-5962-b530-505e6c7ac609'})  2025-09-27 00:41:15.449031 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:41:15.449042 | orchestrator | 2025-09-27 00:41:15.449052 | orchestrator | TASK [Count OSDs put on ceph_db_wal_devices defined in lvm_volumes] ************ 2025-09-27 00:41:15.449063 | orchestrator | Saturday 27 September 2025 00:41:14 +0000 (0:00:00.151) 0:00:14.079 **** 2025-09-27 00:41:15.449074 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-025d8a54-72cd-5dfc-843f-2890244ba468', 'data_vg': 'ceph-025d8a54-72cd-5dfc-843f-2890244ba468'})  2025-09-27 00:41:15.449085 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-9ca7935d-e986-5962-b530-505e6c7ac609', 'data_vg': 'ceph-9ca7935d-e986-5962-b530-505e6c7ac609'})  
2025-09-27 00:41:15.449096 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:41:15.449106 | orchestrator | 2025-09-27 00:41:15.449117 | orchestrator | TASK [Fail if number of OSDs exceeds num_osds for a DB VG] ********************* 2025-09-27 00:41:15.449128 | orchestrator | Saturday 27 September 2025 00:41:15 +0000 (0:00:00.150) 0:00:14.229 **** 2025-09-27 00:41:15.449138 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:41:15.449157 | orchestrator | 2025-09-27 00:41:15.449168 | orchestrator | TASK [Fail if number of OSDs exceeds num_osds for a WAL VG] ******************** 2025-09-27 00:41:15.449179 | orchestrator | Saturday 27 September 2025 00:41:15 +0000 (0:00:00.142) 0:00:14.372 **** 2025-09-27 00:41:15.449190 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:41:15.449201 | orchestrator | 2025-09-27 00:41:15.449235 | orchestrator | TASK [Fail if number of OSDs exceeds num_osds for a DB+WAL VG] ***************** 2025-09-27 00:41:21.914308 | orchestrator | Saturday 27 September 2025 00:41:15 +0000 (0:00:00.151) 0:00:14.523 **** 2025-09-27 00:41:21.914420 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:41:21.914436 | orchestrator | 2025-09-27 00:41:21.914449 | orchestrator | TASK [Print number of OSDs wanted per DB VG] *********************************** 2025-09-27 00:41:21.914460 | orchestrator | Saturday 27 September 2025 00:41:15 +0000 (0:00:00.148) 0:00:14.671 **** 2025-09-27 00:41:21.914471 | orchestrator | ok: [testbed-node-3] => { 2025-09-27 00:41:21.914483 | orchestrator |  "_num_osds_wanted_per_db_vg": {} 2025-09-27 00:41:21.914494 | orchestrator | } 2025-09-27 00:41:21.914505 | orchestrator | 2025-09-27 00:41:21.914516 | orchestrator | TASK [Print number of OSDs wanted per WAL VG] ********************************** 2025-09-27 00:41:21.914527 | orchestrator | Saturday 27 September 2025 00:41:15 +0000 (0:00:00.320) 0:00:14.992 **** 2025-09-27 00:41:21.914538 | orchestrator | ok: [testbed-node-3] => { 2025-09-27 00:41:21.914548 | orchestrator |  "_num_osds_wanted_per_wal_vg": {} 2025-09-27 00:41:21.914559 | orchestrator | } 2025-09-27 00:41:21.914570 | orchestrator | 2025-09-27 00:41:21.914581 | orchestrator | TASK [Print number of OSDs wanted per DB+WAL VG] ******************************* 2025-09-27 00:41:21.914592 | orchestrator | Saturday 27 September 2025 00:41:16 +0000 (0:00:00.171) 0:00:15.163 **** 2025-09-27 00:41:21.914603 | orchestrator | ok: [testbed-node-3] => { 2025-09-27 00:41:21.914613 | orchestrator |  "_num_osds_wanted_per_db_wal_vg": {} 2025-09-27 00:41:21.914624 | orchestrator | } 2025-09-27 00:41:21.914636 | orchestrator | 2025-09-27 00:41:21.914647 | orchestrator | TASK [Gather DB VGs with total and available size in bytes] ******************** 2025-09-27 00:41:21.914658 | orchestrator | Saturday 27 September 2025 00:41:16 +0000 (0:00:00.122) 0:00:15.286 **** 2025-09-27 00:41:21.914669 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:41:21.914680 | orchestrator | 2025-09-27 00:41:21.914691 | orchestrator | TASK [Gather WAL VGs with total and available size in bytes] ******************* 2025-09-27 00:41:21.914701 | orchestrator | Saturday 27 September 2025 00:41:16 +0000 (0:00:00.648) 0:00:15.935 **** 2025-09-27 00:41:21.914712 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:41:21.914723 | orchestrator | 2025-09-27 00:41:21.914733 | orchestrator | TASK [Gather DB+WAL VGs with total and available size in bytes] **************** 2025-09-27 00:41:21.914744 | orchestrator | Saturday 27 September 2025 00:41:17 +0000 
(0:00:00.542) 0:00:16.478 **** 2025-09-27 00:41:21.914755 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:41:21.914765 | orchestrator | 2025-09-27 00:41:21.914776 | orchestrator | TASK [Combine JSON from _db/wal/db_wal_vgs_cmd_output] ************************* 2025-09-27 00:41:21.914787 | orchestrator | Saturday 27 September 2025 00:41:17 +0000 (0:00:00.539) 0:00:17.017 **** 2025-09-27 00:41:21.914800 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:41:21.914813 | orchestrator | 2025-09-27 00:41:21.914825 | orchestrator | TASK [Calculate VG sizes (without buffer)] ************************************* 2025-09-27 00:41:21.914837 | orchestrator | Saturday 27 September 2025 00:41:18 +0000 (0:00:00.176) 0:00:17.194 **** 2025-09-27 00:41:21.914849 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:41:21.914862 | orchestrator | 2025-09-27 00:41:21.914875 | orchestrator | TASK [Calculate VG sizes (with buffer)] **************************************** 2025-09-27 00:41:21.914887 | orchestrator | Saturday 27 September 2025 00:41:18 +0000 (0:00:00.119) 0:00:17.313 **** 2025-09-27 00:41:21.914900 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:41:21.914912 | orchestrator | 2025-09-27 00:41:21.914924 | orchestrator | TASK [Print LVM VGs report data] *********************************************** 2025-09-27 00:41:21.914936 | orchestrator | Saturday 27 September 2025 00:41:18 +0000 (0:00:00.117) 0:00:17.431 **** 2025-09-27 00:41:21.914948 | orchestrator | ok: [testbed-node-3] => { 2025-09-27 00:41:21.914989 | orchestrator |  "vgs_report": { 2025-09-27 00:41:21.915004 | orchestrator |  "vg": [] 2025-09-27 00:41:21.915017 | orchestrator |  } 2025-09-27 00:41:21.915030 | orchestrator | } 2025-09-27 00:41:21.915042 | orchestrator | 2025-09-27 00:41:21.915055 | orchestrator | TASK [Print LVM VG sizes] ****************************************************** 2025-09-27 00:41:21.915067 | orchestrator | Saturday 27 September 2025 00:41:18 +0000 (0:00:00.190) 0:00:17.621 **** 2025-09-27 00:41:21.915079 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:41:21.915091 | orchestrator | 2025-09-27 00:41:21.915104 | orchestrator | TASK [Calculate size needed for LVs on ceph_db_devices] ************************ 2025-09-27 00:41:21.915116 | orchestrator | Saturday 27 September 2025 00:41:18 +0000 (0:00:00.159) 0:00:17.781 **** 2025-09-27 00:41:21.915128 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:41:21.915140 | orchestrator | 2025-09-27 00:41:21.915152 | orchestrator | TASK [Print size needed for LVs on ceph_db_devices] **************************** 2025-09-27 00:41:21.915162 | orchestrator | Saturday 27 September 2025 00:41:18 +0000 (0:00:00.145) 0:00:17.926 **** 2025-09-27 00:41:21.915173 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:41:21.915183 | orchestrator | 2025-09-27 00:41:21.915194 | orchestrator | TASK [Fail if size of DB LVs on ceph_db_devices > available] ******************* 2025-09-27 00:41:21.915226 | orchestrator | Saturday 27 September 2025 00:41:19 +0000 (0:00:00.332) 0:00:18.259 **** 2025-09-27 00:41:21.915238 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:41:21.915248 | orchestrator | 2025-09-27 00:41:21.915259 | orchestrator | TASK [Calculate size needed for LVs on ceph_wal_devices] *********************** 2025-09-27 00:41:21.915270 | orchestrator | Saturday 27 September 2025 00:41:19 +0000 (0:00:00.156) 0:00:18.415 **** 2025-09-27 00:41:21.915281 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:41:21.915291 | orchestrator | 
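A minimal sketch of the "Gather ... VGs with total and available size in bytes" steps above: standard LVM2 JSON reporting via vgs yields a structure like the vgs_report the play prints (empty on these nodes, since no dedicated DB/WAL VGs exist). Any filtering OSISM applies to select only DB/WAL VGs is not visible in the log and is omitted here; running this requires the LVM tools and usually root privileges.

# Illustrative only: collect VG name, total size and free size in bytes.
import json
import subprocess

def gather_vg_report():
    """Return the first LVM report block, i.e. {"vg": [...]}."""
    out = subprocess.run(
        ["vgs", "--reportformat", "json", "--units", "b",
         "-o", "vg_name,vg_size,vg_free"],
        capture_output=True, text=True, check=True,
    ).stdout
    return json.loads(out)["report"][0]

print(json.dumps({"vgs_report": gather_vg_report()}, indent=2))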
2025-09-27 00:41:21.915321 | orchestrator | TASK [Print size needed for LVs on ceph_wal_devices] *************************** 2025-09-27 00:41:21.915333 | orchestrator | Saturday 27 September 2025 00:41:19 +0000 (0:00:00.154) 0:00:18.569 **** 2025-09-27 00:41:21.915343 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:41:21.915354 | orchestrator | 2025-09-27 00:41:21.915365 | orchestrator | TASK [Fail if size of WAL LVs on ceph_wal_devices > available] ***************** 2025-09-27 00:41:21.915375 | orchestrator | Saturday 27 September 2025 00:41:19 +0000 (0:00:00.160) 0:00:18.729 **** 2025-09-27 00:41:21.915386 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:41:21.915396 | orchestrator | 2025-09-27 00:41:21.915407 | orchestrator | TASK [Calculate size needed for WAL LVs on ceph_db_wal_devices] **************** 2025-09-27 00:41:21.915417 | orchestrator | Saturday 27 September 2025 00:41:19 +0000 (0:00:00.140) 0:00:18.870 **** 2025-09-27 00:41:21.915428 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:41:21.915439 | orchestrator | 2025-09-27 00:41:21.915450 | orchestrator | TASK [Print size needed for WAL LVs on ceph_db_wal_devices] ******************** 2025-09-27 00:41:21.915478 | orchestrator | Saturday 27 September 2025 00:41:19 +0000 (0:00:00.137) 0:00:19.008 **** 2025-09-27 00:41:21.915489 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:41:21.915500 | orchestrator | 2025-09-27 00:41:21.915511 | orchestrator | TASK [Calculate size needed for DB LVs on ceph_db_wal_devices] ***************** 2025-09-27 00:41:21.915522 | orchestrator | Saturday 27 September 2025 00:41:20 +0000 (0:00:00.161) 0:00:19.169 **** 2025-09-27 00:41:21.915532 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:41:21.915543 | orchestrator | 2025-09-27 00:41:21.915554 | orchestrator | TASK [Print size needed for DB LVs on ceph_db_wal_devices] ********************* 2025-09-27 00:41:21.915565 | orchestrator | Saturday 27 September 2025 00:41:20 +0000 (0:00:00.135) 0:00:19.305 **** 2025-09-27 00:41:21.915575 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:41:21.915586 | orchestrator | 2025-09-27 00:41:21.915597 | orchestrator | TASK [Fail if size of DB+WAL LVs on ceph_db_wal_devices > available] *********** 2025-09-27 00:41:21.915608 | orchestrator | Saturday 27 September 2025 00:41:20 +0000 (0:00:00.141) 0:00:19.447 **** 2025-09-27 00:41:21.915618 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:41:21.915629 | orchestrator | 2025-09-27 00:41:21.915648 | orchestrator | TASK [Fail if DB LV size < 30 GiB for ceph_db_devices] ************************* 2025-09-27 00:41:21.915659 | orchestrator | Saturday 27 September 2025 00:41:20 +0000 (0:00:00.130) 0:00:19.577 **** 2025-09-27 00:41:21.915670 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:41:21.915681 | orchestrator | 2025-09-27 00:41:21.915692 | orchestrator | TASK [Fail if DB LV size < 30 GiB for ceph_db_wal_devices] ********************* 2025-09-27 00:41:21.915703 | orchestrator | Saturday 27 September 2025 00:41:20 +0000 (0:00:00.137) 0:00:19.715 **** 2025-09-27 00:41:21.915713 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:41:21.915724 | orchestrator | 2025-09-27 00:41:21.915735 | orchestrator | TASK [Create DB LVs for ceph_db_devices] *************************************** 2025-09-27 00:41:21.915745 | orchestrator | Saturday 27 September 2025 00:41:20 +0000 (0:00:00.127) 0:00:19.843 **** 2025-09-27 00:41:21.915758 | orchestrator | skipping: [testbed-node-3] => (item={'data': 
'osd-block-025d8a54-72cd-5dfc-843f-2890244ba468', 'data_vg': 'ceph-025d8a54-72cd-5dfc-843f-2890244ba468'})  2025-09-27 00:41:21.915770 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-9ca7935d-e986-5962-b530-505e6c7ac609', 'data_vg': 'ceph-9ca7935d-e986-5962-b530-505e6c7ac609'})  2025-09-27 00:41:21.915781 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:41:21.915792 | orchestrator | 2025-09-27 00:41:21.915803 | orchestrator | TASK [Print 'Create DB LVs for ceph_db_devices'] ******************************* 2025-09-27 00:41:21.915813 | orchestrator | Saturday 27 September 2025 00:41:20 +0000 (0:00:00.167) 0:00:20.011 **** 2025-09-27 00:41:21.915824 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-025d8a54-72cd-5dfc-843f-2890244ba468', 'data_vg': 'ceph-025d8a54-72cd-5dfc-843f-2890244ba468'})  2025-09-27 00:41:21.915835 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-9ca7935d-e986-5962-b530-505e6c7ac609', 'data_vg': 'ceph-9ca7935d-e986-5962-b530-505e6c7ac609'})  2025-09-27 00:41:21.915846 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:41:21.915856 | orchestrator | 2025-09-27 00:41:21.915867 | orchestrator | TASK [Create WAL LVs for ceph_wal_devices] ************************************* 2025-09-27 00:41:21.915878 | orchestrator | Saturday 27 September 2025 00:41:21 +0000 (0:00:00.324) 0:00:20.335 **** 2025-09-27 00:41:21.915894 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-025d8a54-72cd-5dfc-843f-2890244ba468', 'data_vg': 'ceph-025d8a54-72cd-5dfc-843f-2890244ba468'})  2025-09-27 00:41:21.915905 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-9ca7935d-e986-5962-b530-505e6c7ac609', 'data_vg': 'ceph-9ca7935d-e986-5962-b530-505e6c7ac609'})  2025-09-27 00:41:21.915916 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:41:21.915927 | orchestrator | 2025-09-27 00:41:21.915938 | orchestrator | TASK [Print 'Create WAL LVs for ceph_wal_devices'] ***************************** 2025-09-27 00:41:21.915949 | orchestrator | Saturday 27 September 2025 00:41:21 +0000 (0:00:00.152) 0:00:20.487 **** 2025-09-27 00:41:21.915959 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-025d8a54-72cd-5dfc-843f-2890244ba468', 'data_vg': 'ceph-025d8a54-72cd-5dfc-843f-2890244ba468'})  2025-09-27 00:41:21.915970 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-9ca7935d-e986-5962-b530-505e6c7ac609', 'data_vg': 'ceph-9ca7935d-e986-5962-b530-505e6c7ac609'})  2025-09-27 00:41:21.915981 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:41:21.915992 | orchestrator | 2025-09-27 00:41:21.916002 | orchestrator | TASK [Create WAL LVs for ceph_db_wal_devices] ********************************** 2025-09-27 00:41:21.916013 | orchestrator | Saturday 27 September 2025 00:41:21 +0000 (0:00:00.149) 0:00:20.637 **** 2025-09-27 00:41:21.916024 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-025d8a54-72cd-5dfc-843f-2890244ba468', 'data_vg': 'ceph-025d8a54-72cd-5dfc-843f-2890244ba468'})  2025-09-27 00:41:21.916035 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-9ca7935d-e986-5962-b530-505e6c7ac609', 'data_vg': 'ceph-9ca7935d-e986-5962-b530-505e6c7ac609'})  2025-09-27 00:41:21.916046 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:41:21.916063 | orchestrator | 2025-09-27 00:41:21.916074 | orchestrator | TASK [Print 'Create WAL LVs for ceph_db_wal_devices'] ************************** 
2025-09-27 00:41:21.916085 | orchestrator | Saturday 27 September 2025 00:41:21 +0000 (0:00:00.166) 0:00:20.804 **** 2025-09-27 00:41:21.916095 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-025d8a54-72cd-5dfc-843f-2890244ba468', 'data_vg': 'ceph-025d8a54-72cd-5dfc-843f-2890244ba468'})  2025-09-27 00:41:21.916112 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-9ca7935d-e986-5962-b530-505e6c7ac609', 'data_vg': 'ceph-9ca7935d-e986-5962-b530-505e6c7ac609'})  2025-09-27 00:41:27.713882 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:41:27.713992 | orchestrator | 2025-09-27 00:41:27.714009 | orchestrator | TASK [Create DB LVs for ceph_db_wal_devices] *********************************** 2025-09-27 00:41:27.714086 | orchestrator | Saturday 27 September 2025 00:41:21 +0000 (0:00:00.185) 0:00:20.989 **** 2025-09-27 00:41:27.714098 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-025d8a54-72cd-5dfc-843f-2890244ba468', 'data_vg': 'ceph-025d8a54-72cd-5dfc-843f-2890244ba468'})  2025-09-27 00:41:27.714111 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-9ca7935d-e986-5962-b530-505e6c7ac609', 'data_vg': 'ceph-9ca7935d-e986-5962-b530-505e6c7ac609'})  2025-09-27 00:41:27.714122 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:41:27.714133 | orchestrator | 2025-09-27 00:41:27.714144 | orchestrator | TASK [Print 'Create DB LVs for ceph_db_wal_devices'] *************************** 2025-09-27 00:41:27.714155 | orchestrator | Saturday 27 September 2025 00:41:22 +0000 (0:00:00.220) 0:00:21.210 **** 2025-09-27 00:41:27.714166 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-025d8a54-72cd-5dfc-843f-2890244ba468', 'data_vg': 'ceph-025d8a54-72cd-5dfc-843f-2890244ba468'})  2025-09-27 00:41:27.714178 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-9ca7935d-e986-5962-b530-505e6c7ac609', 'data_vg': 'ceph-9ca7935d-e986-5962-b530-505e6c7ac609'})  2025-09-27 00:41:27.714189 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:41:27.714200 | orchestrator | 2025-09-27 00:41:27.714270 | orchestrator | TASK [Get list of Ceph LVs with associated VGs] ******************************** 2025-09-27 00:41:27.714282 | orchestrator | Saturday 27 September 2025 00:41:22 +0000 (0:00:00.186) 0:00:21.397 **** 2025-09-27 00:41:27.714293 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:41:27.714305 | orchestrator | 2025-09-27 00:41:27.714316 | orchestrator | TASK [Get list of Ceph PVs with associated VGs] ******************************** 2025-09-27 00:41:27.714326 | orchestrator | Saturday 27 September 2025 00:41:22 +0000 (0:00:00.611) 0:00:22.008 **** 2025-09-27 00:41:27.714337 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:41:27.714348 | orchestrator | 2025-09-27 00:41:27.714359 | orchestrator | TASK [Combine JSON from _lvs_cmd_output/_pvs_cmd_output] *********************** 2025-09-27 00:41:27.714370 | orchestrator | Saturday 27 September 2025 00:41:23 +0000 (0:00:00.548) 0:00:22.557 **** 2025-09-27 00:41:27.714380 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:41:27.714391 | orchestrator | 2025-09-27 00:41:27.714402 | orchestrator | TASK [Create list of VG/LV names] ********************************************** 2025-09-27 00:41:27.714413 | orchestrator | Saturday 27 September 2025 00:41:23 +0000 (0:00:00.145) 0:00:22.702 **** 2025-09-27 00:41:27.714426 | orchestrator | ok: [testbed-node-3] => (item={'lv_name': 
'osd-block-025d8a54-72cd-5dfc-843f-2890244ba468', 'vg_name': 'ceph-025d8a54-72cd-5dfc-843f-2890244ba468'}) 2025-09-27 00:41:27.714440 | orchestrator | ok: [testbed-node-3] => (item={'lv_name': 'osd-block-9ca7935d-e986-5962-b530-505e6c7ac609', 'vg_name': 'ceph-9ca7935d-e986-5962-b530-505e6c7ac609'}) 2025-09-27 00:41:27.714453 | orchestrator | 2025-09-27 00:41:27.714465 | orchestrator | TASK [Fail if block LV defined in lvm_volumes is missing] ********************** 2025-09-27 00:41:27.714477 | orchestrator | Saturday 27 September 2025 00:41:23 +0000 (0:00:00.184) 0:00:22.887 **** 2025-09-27 00:41:27.714489 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-025d8a54-72cd-5dfc-843f-2890244ba468', 'data_vg': 'ceph-025d8a54-72cd-5dfc-843f-2890244ba468'})  2025-09-27 00:41:27.714524 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-9ca7935d-e986-5962-b530-505e6c7ac609', 'data_vg': 'ceph-9ca7935d-e986-5962-b530-505e6c7ac609'})  2025-09-27 00:41:27.714537 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:41:27.714549 | orchestrator | 2025-09-27 00:41:27.714561 | orchestrator | TASK [Fail if DB LV defined in lvm_volumes is missing] ************************* 2025-09-27 00:41:27.714573 | orchestrator | Saturday 27 September 2025 00:41:23 +0000 (0:00:00.172) 0:00:23.060 **** 2025-09-27 00:41:27.714585 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-025d8a54-72cd-5dfc-843f-2890244ba468', 'data_vg': 'ceph-025d8a54-72cd-5dfc-843f-2890244ba468'})  2025-09-27 00:41:27.714597 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-9ca7935d-e986-5962-b530-505e6c7ac609', 'data_vg': 'ceph-9ca7935d-e986-5962-b530-505e6c7ac609'})  2025-09-27 00:41:27.714610 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:41:27.714622 | orchestrator | 2025-09-27 00:41:27.714635 | orchestrator | TASK [Fail if WAL LV defined in lvm_volumes is missing] ************************ 2025-09-27 00:41:27.714647 | orchestrator | Saturday 27 September 2025 00:41:24 +0000 (0:00:00.452) 0:00:23.513 **** 2025-09-27 00:41:27.714659 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-025d8a54-72cd-5dfc-843f-2890244ba468', 'data_vg': 'ceph-025d8a54-72cd-5dfc-843f-2890244ba468'})  2025-09-27 00:41:27.714672 | orchestrator | skipping: [testbed-node-3] => (item={'data': 'osd-block-9ca7935d-e986-5962-b530-505e6c7ac609', 'data_vg': 'ceph-9ca7935d-e986-5962-b530-505e6c7ac609'})  2025-09-27 00:41:27.714684 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:41:27.714696 | orchestrator | 2025-09-27 00:41:27.714708 | orchestrator | TASK [Print LVM report data] *************************************************** 2025-09-27 00:41:27.714720 | orchestrator | Saturday 27 September 2025 00:41:24 +0000 (0:00:00.152) 0:00:23.665 **** 2025-09-27 00:41:27.714732 | orchestrator | ok: [testbed-node-3] => { 2025-09-27 00:41:27.714745 | orchestrator |  "lvm_report": { 2025-09-27 00:41:27.714758 | orchestrator |  "lv": [ 2025-09-27 00:41:27.714771 | orchestrator |  { 2025-09-27 00:41:27.714800 | orchestrator |  "lv_name": "osd-block-025d8a54-72cd-5dfc-843f-2890244ba468", 2025-09-27 00:41:27.714813 | orchestrator |  "vg_name": "ceph-025d8a54-72cd-5dfc-843f-2890244ba468" 2025-09-27 00:41:27.714824 | orchestrator |  }, 2025-09-27 00:41:27.714834 | orchestrator |  { 2025-09-27 00:41:27.714845 | orchestrator |  "lv_name": "osd-block-9ca7935d-e986-5962-b530-505e6c7ac609", 2025-09-27 00:41:27.714856 | orchestrator |  "vg_name": 
"ceph-9ca7935d-e986-5962-b530-505e6c7ac609" 2025-09-27 00:41:27.714867 | orchestrator |  } 2025-09-27 00:41:27.714877 | orchestrator |  ], 2025-09-27 00:41:27.714888 | orchestrator |  "pv": [ 2025-09-27 00:41:27.714899 | orchestrator |  { 2025-09-27 00:41:27.714910 | orchestrator |  "pv_name": "/dev/sdb", 2025-09-27 00:41:27.714920 | orchestrator |  "vg_name": "ceph-025d8a54-72cd-5dfc-843f-2890244ba468" 2025-09-27 00:41:27.714931 | orchestrator |  }, 2025-09-27 00:41:27.714942 | orchestrator |  { 2025-09-27 00:41:27.714952 | orchestrator |  "pv_name": "/dev/sdc", 2025-09-27 00:41:27.714963 | orchestrator |  "vg_name": "ceph-9ca7935d-e986-5962-b530-505e6c7ac609" 2025-09-27 00:41:27.714973 | orchestrator |  } 2025-09-27 00:41:27.714984 | orchestrator |  ] 2025-09-27 00:41:27.714995 | orchestrator |  } 2025-09-27 00:41:27.715005 | orchestrator | } 2025-09-27 00:41:27.715016 | orchestrator | 2025-09-27 00:41:27.715027 | orchestrator | PLAY [Ceph create LVM devices] ************************************************* 2025-09-27 00:41:27.715038 | orchestrator | 2025-09-27 00:41:27.715048 | orchestrator | TASK [Get extra vars for Ceph configuration] *********************************** 2025-09-27 00:41:27.715059 | orchestrator | Saturday 27 September 2025 00:41:24 +0000 (0:00:00.290) 0:00:23.955 **** 2025-09-27 00:41:27.715070 | orchestrator | ok: [testbed-node-4 -> testbed-manager(192.168.16.5)] 2025-09-27 00:41:27.715088 | orchestrator | 2025-09-27 00:41:27.715099 | orchestrator | TASK [Get initial list of available block devices] ***************************** 2025-09-27 00:41:27.715110 | orchestrator | Saturday 27 September 2025 00:41:25 +0000 (0:00:00.249) 0:00:24.204 **** 2025-09-27 00:41:27.715120 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:41:27.715131 | orchestrator | 2025-09-27 00:41:27.715142 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-09-27 00:41:27.715152 | orchestrator | Saturday 27 September 2025 00:41:25 +0000 (0:00:00.217) 0:00:24.421 **** 2025-09-27 00:41:27.715181 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-4 => (item=loop0) 2025-09-27 00:41:27.715192 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-4 => (item=loop1) 2025-09-27 00:41:27.715222 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-4 => (item=loop2) 2025-09-27 00:41:27.715234 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-4 => (item=loop3) 2025-09-27 00:41:27.715245 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-4 => (item=loop4) 2025-09-27 00:41:27.715256 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-4 => (item=loop5) 2025-09-27 00:41:27.715266 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-4 => (item=loop6) 2025-09-27 00:41:27.715282 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-4 => (item=loop7) 2025-09-27 00:41:27.715293 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-4 => (item=sda) 2025-09-27 00:41:27.715304 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-4 => (item=sdb) 2025-09-27 00:41:27.715314 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-4 => (item=sdc) 2025-09-27 00:41:27.715325 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-4 => (item=sdd) 
2025-09-27 00:41:27.715336 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-4 => (item=sr0) 2025-09-27 00:41:27.715346 | orchestrator | 2025-09-27 00:41:27.715357 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-09-27 00:41:27.715368 | orchestrator | Saturday 27 September 2025 00:41:25 +0000 (0:00:00.402) 0:00:24.824 **** 2025-09-27 00:41:27.715379 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:41:27.715389 | orchestrator | 2025-09-27 00:41:27.715400 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-09-27 00:41:27.715410 | orchestrator | Saturday 27 September 2025 00:41:25 +0000 (0:00:00.193) 0:00:25.017 **** 2025-09-27 00:41:27.715421 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:41:27.715432 | orchestrator | 2025-09-27 00:41:27.715442 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-09-27 00:41:27.715453 | orchestrator | Saturday 27 September 2025 00:41:26 +0000 (0:00:00.198) 0:00:25.215 **** 2025-09-27 00:41:27.715463 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:41:27.715474 | orchestrator | 2025-09-27 00:41:27.715485 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-09-27 00:41:27.715495 | orchestrator | Saturday 27 September 2025 00:41:26 +0000 (0:00:00.191) 0:00:25.407 **** 2025-09-27 00:41:27.715506 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:41:27.715517 | orchestrator | 2025-09-27 00:41:27.715527 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-09-27 00:41:27.715538 | orchestrator | Saturday 27 September 2025 00:41:27 +0000 (0:00:00.775) 0:00:26.183 **** 2025-09-27 00:41:27.715549 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:41:27.715559 | orchestrator | 2025-09-27 00:41:27.715570 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-09-27 00:41:27.715581 | orchestrator | Saturday 27 September 2025 00:41:27 +0000 (0:00:00.208) 0:00:26.391 **** 2025-09-27 00:41:27.715591 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:41:27.715602 | orchestrator | 2025-09-27 00:41:27.715619 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-09-27 00:41:27.715630 | orchestrator | Saturday 27 September 2025 00:41:27 +0000 (0:00:00.202) 0:00:26.594 **** 2025-09-27 00:41:27.715641 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:41:27.715651 | orchestrator | 2025-09-27 00:41:27.715669 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-09-27 00:41:38.279685 | orchestrator | Saturday 27 September 2025 00:41:27 +0000 (0:00:00.198) 0:00:26.793 **** 2025-09-27 00:41:38.279816 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:41:38.279834 | orchestrator | 2025-09-27 00:41:38.279856 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-09-27 00:41:38.279869 | orchestrator | Saturday 27 September 2025 00:41:27 +0000 (0:00:00.193) 0:00:26.986 **** 2025-09-27 00:41:38.279881 | orchestrator | ok: [testbed-node-4] => (item=scsi-0QEMU_QEMU_HARDDISK_ccfdba47-3be1-47ca-9d9b-c14dc21688e2) 2025-09-27 00:41:38.279893 | orchestrator | ok: [testbed-node-4] => (item=scsi-SQEMU_QEMU_HARDDISK_ccfdba47-3be1-47ca-9d9b-c14dc21688e2) 2025-09-27 
00:41:38.279904 | orchestrator | 2025-09-27 00:41:38.279916 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-09-27 00:41:38.279927 | orchestrator | Saturday 27 September 2025 00:41:28 +0000 (0:00:00.400) 0:00:27.387 **** 2025-09-27 00:41:38.279938 | orchestrator | ok: [testbed-node-4] => (item=scsi-0QEMU_QEMU_HARDDISK_e258aa1c-ff59-4b5b-956f-d2cfc00f460b) 2025-09-27 00:41:38.279949 | orchestrator | ok: [testbed-node-4] => (item=scsi-SQEMU_QEMU_HARDDISK_e258aa1c-ff59-4b5b-956f-d2cfc00f460b) 2025-09-27 00:41:38.279964 | orchestrator | 2025-09-27 00:41:38.279984 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-09-27 00:41:38.280002 | orchestrator | Saturday 27 September 2025 00:41:28 +0000 (0:00:00.431) 0:00:27.819 **** 2025-09-27 00:41:38.280022 | orchestrator | ok: [testbed-node-4] => (item=scsi-0QEMU_QEMU_HARDDISK_f6166654-1631-4845-81e5-73fa20742766) 2025-09-27 00:41:38.280042 | orchestrator | ok: [testbed-node-4] => (item=scsi-SQEMU_QEMU_HARDDISK_f6166654-1631-4845-81e5-73fa20742766) 2025-09-27 00:41:38.280062 | orchestrator | 2025-09-27 00:41:38.280075 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-09-27 00:41:38.280085 | orchestrator | Saturday 27 September 2025 00:41:29 +0000 (0:00:00.449) 0:00:28.268 **** 2025-09-27 00:41:38.280096 | orchestrator | ok: [testbed-node-4] => (item=scsi-0QEMU_QEMU_HARDDISK_88b94aa1-4c02-44af-bedb-78cbed569408) 2025-09-27 00:41:38.280107 | orchestrator | ok: [testbed-node-4] => (item=scsi-SQEMU_QEMU_HARDDISK_88b94aa1-4c02-44af-bedb-78cbed569408) 2025-09-27 00:41:38.280118 | orchestrator | 2025-09-27 00:41:38.280129 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-09-27 00:41:38.280140 | orchestrator | Saturday 27 September 2025 00:41:29 +0000 (0:00:00.443) 0:00:28.712 **** 2025-09-27 00:41:38.280150 | orchestrator | ok: [testbed-node-4] => (item=ata-QEMU_DVD-ROM_QM00001) 2025-09-27 00:41:38.280161 | orchestrator | 2025-09-27 00:41:38.280172 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-09-27 00:41:38.280183 | orchestrator | Saturday 27 September 2025 00:41:29 +0000 (0:00:00.327) 0:00:29.039 **** 2025-09-27 00:41:38.280194 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-4 => (item=loop0) 2025-09-27 00:41:38.280255 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-4 => (item=loop1) 2025-09-27 00:41:38.280268 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-4 => (item=loop2) 2025-09-27 00:41:38.280280 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-4 => (item=loop3) 2025-09-27 00:41:38.280292 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-4 => (item=loop4) 2025-09-27 00:41:38.280305 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-4 => (item=loop5) 2025-09-27 00:41:38.280317 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-4 => (item=loop6) 2025-09-27 00:41:38.280349 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-4 => (item=loop7) 2025-09-27 00:41:38.280361 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-4 => (item=sda) 2025-09-27 00:41:38.280373 | 
orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-4 => (item=sdb) 2025-09-27 00:41:38.280385 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-4 => (item=sdc) 2025-09-27 00:41:38.280397 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-4 => (item=sdd) 2025-09-27 00:41:38.280409 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-4 => (item=sr0) 2025-09-27 00:41:38.280421 | orchestrator | 2025-09-27 00:41:38.280434 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-09-27 00:41:38.280446 | orchestrator | Saturday 27 September 2025 00:41:30 +0000 (0:00:00.691) 0:00:29.731 **** 2025-09-27 00:41:38.280458 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:41:38.280470 | orchestrator | 2025-09-27 00:41:38.280483 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-09-27 00:41:38.280496 | orchestrator | Saturday 27 September 2025 00:41:30 +0000 (0:00:00.187) 0:00:29.918 **** 2025-09-27 00:41:38.280508 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:41:38.280520 | orchestrator | 2025-09-27 00:41:38.280533 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-09-27 00:41:38.280613 | orchestrator | Saturday 27 September 2025 00:41:31 +0000 (0:00:00.217) 0:00:30.136 **** 2025-09-27 00:41:38.280625 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:41:38.280636 | orchestrator | 2025-09-27 00:41:38.280647 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-09-27 00:41:38.280658 | orchestrator | Saturday 27 September 2025 00:41:31 +0000 (0:00:00.255) 0:00:30.391 **** 2025-09-27 00:41:38.280668 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:41:38.280679 | orchestrator | 2025-09-27 00:41:38.280710 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-09-27 00:41:38.280721 | orchestrator | Saturday 27 September 2025 00:41:31 +0000 (0:00:00.232) 0:00:30.624 **** 2025-09-27 00:41:38.280732 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:41:38.280743 | orchestrator | 2025-09-27 00:41:38.280753 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-09-27 00:41:38.280764 | orchestrator | Saturday 27 September 2025 00:41:31 +0000 (0:00:00.191) 0:00:30.815 **** 2025-09-27 00:41:38.280774 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:41:38.280785 | orchestrator | 2025-09-27 00:41:38.280796 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-09-27 00:41:38.280807 | orchestrator | Saturday 27 September 2025 00:41:31 +0000 (0:00:00.227) 0:00:31.043 **** 2025-09-27 00:41:38.280817 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:41:38.280828 | orchestrator | 2025-09-27 00:41:38.280839 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-09-27 00:41:38.280849 | orchestrator | Saturday 27 September 2025 00:41:32 +0000 (0:00:00.217) 0:00:31.260 **** 2025-09-27 00:41:38.280860 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:41:38.280870 | orchestrator | 2025-09-27 00:41:38.280881 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-09-27 00:41:38.280892 | orchestrator 
| Saturday 27 September 2025 00:41:32 +0000 (0:00:00.224) 0:00:31.484 **** 2025-09-27 00:41:38.280902 | orchestrator | ok: [testbed-node-4] => (item=sda1) 2025-09-27 00:41:38.280913 | orchestrator | ok: [testbed-node-4] => (item=sda14) 2025-09-27 00:41:38.280924 | orchestrator | ok: [testbed-node-4] => (item=sda15) 2025-09-27 00:41:38.280935 | orchestrator | ok: [testbed-node-4] => (item=sda16) 2025-09-27 00:41:38.280945 | orchestrator | 2025-09-27 00:41:38.280957 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-09-27 00:41:38.280967 | orchestrator | Saturday 27 September 2025 00:41:33 +0000 (0:00:00.860) 0:00:32.344 **** 2025-09-27 00:41:38.280987 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:41:38.280998 | orchestrator | 2025-09-27 00:41:38.281009 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-09-27 00:41:38.281019 | orchestrator | Saturday 27 September 2025 00:41:33 +0000 (0:00:00.195) 0:00:32.540 **** 2025-09-27 00:41:38.281030 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:41:38.281040 | orchestrator | 2025-09-27 00:41:38.281051 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-09-27 00:41:38.281062 | orchestrator | Saturday 27 September 2025 00:41:33 +0000 (0:00:00.202) 0:00:32.742 **** 2025-09-27 00:41:38.281072 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:41:38.281083 | orchestrator | 2025-09-27 00:41:38.281093 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-09-27 00:41:38.281104 | orchestrator | Saturday 27 September 2025 00:41:34 +0000 (0:00:00.674) 0:00:33.416 **** 2025-09-27 00:41:38.281115 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:41:38.281126 | orchestrator | 2025-09-27 00:41:38.281136 | orchestrator | TASK [Check whether ceph_db_wal_devices is used exclusively] ******************* 2025-09-27 00:41:38.281147 | orchestrator | Saturday 27 September 2025 00:41:34 +0000 (0:00:00.218) 0:00:33.635 **** 2025-09-27 00:41:38.281158 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:41:38.281168 | orchestrator | 2025-09-27 00:41:38.281179 | orchestrator | TASK [Create dict of block VGs -> PVs from ceph_osd_devices] ******************* 2025-09-27 00:41:38.281190 | orchestrator | Saturday 27 September 2025 00:41:34 +0000 (0:00:00.122) 0:00:33.757 **** 2025-09-27 00:41:38.281200 | orchestrator | ok: [testbed-node-4] => (item={'key': 'sdb', 'value': {'osd_lvm_uuid': 'e62f59a6-4044-5e93-b85c-9f8cca280e9f'}}) 2025-09-27 00:41:38.281231 | orchestrator | ok: [testbed-node-4] => (item={'key': 'sdc', 'value': {'osd_lvm_uuid': '634a63d2-bd22-5328-9676-28392545ed43'}}) 2025-09-27 00:41:38.281242 | orchestrator | 2025-09-27 00:41:38.281253 | orchestrator | TASK [Create block VGs] ******************************************************** 2025-09-27 00:41:38.281264 | orchestrator | Saturday 27 September 2025 00:41:34 +0000 (0:00:00.183) 0:00:33.941 **** 2025-09-27 00:41:38.281276 | orchestrator | changed: [testbed-node-4] => (item={'data': 'osd-block-e62f59a6-4044-5e93-b85c-9f8cca280e9f', 'data_vg': 'ceph-e62f59a6-4044-5e93-b85c-9f8cca280e9f'}) 2025-09-27 00:41:38.281288 | orchestrator | changed: [testbed-node-4] => (item={'data': 'osd-block-634a63d2-bd22-5328-9676-28392545ed43', 'data_vg': 'ceph-634a63d2-bd22-5328-9676-28392545ed43'}) 2025-09-27 00:41:38.281298 | orchestrator | 2025-09-27 00:41:38.281309 | orchestrator | TASK 
[Print 'Create block VGs'] ************************************************ 2025-09-27 00:41:38.281320 | orchestrator | Saturday 27 September 2025 00:41:36 +0000 (0:00:01.889) 0:00:35.830 **** 2025-09-27 00:41:38.281331 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-e62f59a6-4044-5e93-b85c-9f8cca280e9f', 'data_vg': 'ceph-e62f59a6-4044-5e93-b85c-9f8cca280e9f'})  2025-09-27 00:41:38.281343 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-634a63d2-bd22-5328-9676-28392545ed43', 'data_vg': 'ceph-634a63d2-bd22-5328-9676-28392545ed43'})  2025-09-27 00:41:38.281354 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:41:38.281365 | orchestrator | 2025-09-27 00:41:38.281376 | orchestrator | TASK [Create block LVs] ******************************************************** 2025-09-27 00:41:38.281387 | orchestrator | Saturday 27 September 2025 00:41:36 +0000 (0:00:00.158) 0:00:35.989 **** 2025-09-27 00:41:38.281398 | orchestrator | changed: [testbed-node-4] => (item={'data': 'osd-block-e62f59a6-4044-5e93-b85c-9f8cca280e9f', 'data_vg': 'ceph-e62f59a6-4044-5e93-b85c-9f8cca280e9f'}) 2025-09-27 00:41:38.281408 | orchestrator | changed: [testbed-node-4] => (item={'data': 'osd-block-634a63d2-bd22-5328-9676-28392545ed43', 'data_vg': 'ceph-634a63d2-bd22-5328-9676-28392545ed43'}) 2025-09-27 00:41:38.281419 | orchestrator | 2025-09-27 00:41:38.281437 | orchestrator | TASK [Print 'Create block LVs'] ************************************************ 2025-09-27 00:41:43.499923 | orchestrator | Saturday 27 September 2025 00:41:38 +0000 (0:00:01.361) 0:00:37.351 **** 2025-09-27 00:41:43.500050 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-e62f59a6-4044-5e93-b85c-9f8cca280e9f', 'data_vg': 'ceph-e62f59a6-4044-5e93-b85c-9f8cca280e9f'})  2025-09-27 00:41:43.500068 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-634a63d2-bd22-5328-9676-28392545ed43', 'data_vg': 'ceph-634a63d2-bd22-5328-9676-28392545ed43'})  2025-09-27 00:41:43.500079 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:41:43.500091 | orchestrator | 2025-09-27 00:41:43.500104 | orchestrator | TASK [Create DB VGs] *********************************************************** 2025-09-27 00:41:43.500115 | orchestrator | Saturday 27 September 2025 00:41:38 +0000 (0:00:00.171) 0:00:37.522 **** 2025-09-27 00:41:43.500125 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:41:43.500136 | orchestrator | 2025-09-27 00:41:43.500146 | orchestrator | TASK [Print 'Create DB VGs'] *************************************************** 2025-09-27 00:41:43.500158 | orchestrator | Saturday 27 September 2025 00:41:38 +0000 (0:00:00.160) 0:00:37.683 **** 2025-09-27 00:41:43.500169 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-e62f59a6-4044-5e93-b85c-9f8cca280e9f', 'data_vg': 'ceph-e62f59a6-4044-5e93-b85c-9f8cca280e9f'})  2025-09-27 00:41:43.500196 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-634a63d2-bd22-5328-9676-28392545ed43', 'data_vg': 'ceph-634a63d2-bd22-5328-9676-28392545ed43'})  2025-09-27 00:41:43.500244 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:41:43.500255 | orchestrator | 2025-09-27 00:41:43.500266 | orchestrator | TASK [Create WAL VGs] ********************************************************** 2025-09-27 00:41:43.500277 | orchestrator | Saturday 27 September 2025 00:41:38 +0000 (0:00:00.161) 0:00:37.844 **** 2025-09-27 00:41:43.500287 | orchestrator | skipping: [testbed-node-4] 
2025-09-27 00:41:43.500298 | orchestrator | 2025-09-27 00:41:43.500309 | orchestrator | TASK [Print 'Create WAL VGs'] ************************************************** 2025-09-27 00:41:43.500320 | orchestrator | Saturday 27 September 2025 00:41:38 +0000 (0:00:00.152) 0:00:37.997 **** 2025-09-27 00:41:43.500331 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-e62f59a6-4044-5e93-b85c-9f8cca280e9f', 'data_vg': 'ceph-e62f59a6-4044-5e93-b85c-9f8cca280e9f'})  2025-09-27 00:41:43.500341 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-634a63d2-bd22-5328-9676-28392545ed43', 'data_vg': 'ceph-634a63d2-bd22-5328-9676-28392545ed43'})  2025-09-27 00:41:43.500352 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:41:43.500363 | orchestrator | 2025-09-27 00:41:43.500374 | orchestrator | TASK [Create DB+WAL VGs] ******************************************************* 2025-09-27 00:41:43.500384 | orchestrator | Saturday 27 September 2025 00:41:39 +0000 (0:00:00.151) 0:00:38.148 **** 2025-09-27 00:41:43.500400 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:41:43.500411 | orchestrator | 2025-09-27 00:41:43.500422 | orchestrator | TASK [Print 'Create DB+WAL VGs'] *********************************************** 2025-09-27 00:41:43.500433 | orchestrator | Saturday 27 September 2025 00:41:39 +0000 (0:00:00.325) 0:00:38.474 **** 2025-09-27 00:41:43.500444 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-e62f59a6-4044-5e93-b85c-9f8cca280e9f', 'data_vg': 'ceph-e62f59a6-4044-5e93-b85c-9f8cca280e9f'})  2025-09-27 00:41:43.500454 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-634a63d2-bd22-5328-9676-28392545ed43', 'data_vg': 'ceph-634a63d2-bd22-5328-9676-28392545ed43'})  2025-09-27 00:41:43.500468 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:41:43.500480 | orchestrator | 2025-09-27 00:41:43.500492 | orchestrator | TASK [Prepare variables for OSD count check] *********************************** 2025-09-27 00:41:43.500504 | orchestrator | Saturday 27 September 2025 00:41:39 +0000 (0:00:00.148) 0:00:38.622 **** 2025-09-27 00:41:43.500516 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:41:43.500530 | orchestrator | 2025-09-27 00:41:43.500543 | orchestrator | TASK [Count OSDs put on ceph_db_devices defined in lvm_volumes] **************** 2025-09-27 00:41:43.500555 | orchestrator | Saturday 27 September 2025 00:41:39 +0000 (0:00:00.132) 0:00:38.755 **** 2025-09-27 00:41:43.500575 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-e62f59a6-4044-5e93-b85c-9f8cca280e9f', 'data_vg': 'ceph-e62f59a6-4044-5e93-b85c-9f8cca280e9f'})  2025-09-27 00:41:43.500588 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-634a63d2-bd22-5328-9676-28392545ed43', 'data_vg': 'ceph-634a63d2-bd22-5328-9676-28392545ed43'})  2025-09-27 00:41:43.500600 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:41:43.500612 | orchestrator | 2025-09-27 00:41:43.500624 | orchestrator | TASK [Count OSDs put on ceph_wal_devices defined in lvm_volumes] *************** 2025-09-27 00:41:43.500636 | orchestrator | Saturday 27 September 2025 00:41:39 +0000 (0:00:00.141) 0:00:38.897 **** 2025-09-27 00:41:43.500648 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-e62f59a6-4044-5e93-b85c-9f8cca280e9f', 'data_vg': 'ceph-e62f59a6-4044-5e93-b85c-9f8cca280e9f'})  2025-09-27 00:41:43.500661 | orchestrator | skipping: [testbed-node-4] => (item={'data': 
'osd-block-634a63d2-bd22-5328-9676-28392545ed43', 'data_vg': 'ceph-634a63d2-bd22-5328-9676-28392545ed43'})  2025-09-27 00:41:43.500674 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:41:43.500686 | orchestrator | 2025-09-27 00:41:43.500698 | orchestrator | TASK [Count OSDs put on ceph_db_wal_devices defined in lvm_volumes] ************ 2025-09-27 00:41:43.500710 | orchestrator | Saturday 27 September 2025 00:41:39 +0000 (0:00:00.146) 0:00:39.043 **** 2025-09-27 00:41:43.500741 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-e62f59a6-4044-5e93-b85c-9f8cca280e9f', 'data_vg': 'ceph-e62f59a6-4044-5e93-b85c-9f8cca280e9f'})  2025-09-27 00:41:43.500754 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-634a63d2-bd22-5328-9676-28392545ed43', 'data_vg': 'ceph-634a63d2-bd22-5328-9676-28392545ed43'})  2025-09-27 00:41:43.500767 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:41:43.500779 | orchestrator | 2025-09-27 00:41:43.500791 | orchestrator | TASK [Fail if number of OSDs exceeds num_osds for a DB VG] ********************* 2025-09-27 00:41:43.500804 | orchestrator | Saturday 27 September 2025 00:41:40 +0000 (0:00:00.141) 0:00:39.184 **** 2025-09-27 00:41:43.500815 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:41:43.500826 | orchestrator | 2025-09-27 00:41:43.500836 | orchestrator | TASK [Fail if number of OSDs exceeds num_osds for a WAL VG] ******************** 2025-09-27 00:41:43.500847 | orchestrator | Saturday 27 September 2025 00:41:40 +0000 (0:00:00.146) 0:00:39.330 **** 2025-09-27 00:41:43.500857 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:41:43.500868 | orchestrator | 2025-09-27 00:41:43.500878 | orchestrator | TASK [Fail if number of OSDs exceeds num_osds for a DB+WAL VG] ***************** 2025-09-27 00:41:43.500889 | orchestrator | Saturday 27 September 2025 00:41:40 +0000 (0:00:00.117) 0:00:39.448 **** 2025-09-27 00:41:43.500899 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:41:43.500910 | orchestrator | 2025-09-27 00:41:43.500920 | orchestrator | TASK [Print number of OSDs wanted per DB VG] *********************************** 2025-09-27 00:41:43.500931 | orchestrator | Saturday 27 September 2025 00:41:40 +0000 (0:00:00.129) 0:00:39.578 **** 2025-09-27 00:41:43.500941 | orchestrator | ok: [testbed-node-4] => { 2025-09-27 00:41:43.500952 | orchestrator |  "_num_osds_wanted_per_db_vg": {} 2025-09-27 00:41:43.500963 | orchestrator | } 2025-09-27 00:41:43.500973 | orchestrator | 2025-09-27 00:41:43.500984 | orchestrator | TASK [Print number of OSDs wanted per WAL VG] ********************************** 2025-09-27 00:41:43.500995 | orchestrator | Saturday 27 September 2025 00:41:40 +0000 (0:00:00.125) 0:00:39.704 **** 2025-09-27 00:41:43.501005 | orchestrator | ok: [testbed-node-4] => { 2025-09-27 00:41:43.501016 | orchestrator |  "_num_osds_wanted_per_wal_vg": {} 2025-09-27 00:41:43.501026 | orchestrator | } 2025-09-27 00:41:43.501037 | orchestrator | 2025-09-27 00:41:43.501047 | orchestrator | TASK [Print number of OSDs wanted per DB+WAL VG] ******************************* 2025-09-27 00:41:43.501058 | orchestrator | Saturday 27 September 2025 00:41:40 +0000 (0:00:00.119) 0:00:39.823 **** 2025-09-27 00:41:43.501069 | orchestrator | ok: [testbed-node-4] => { 2025-09-27 00:41:43.501079 | orchestrator |  "_num_osds_wanted_per_db_wal_vg": {} 2025-09-27 00:41:43.501097 | orchestrator | } 2025-09-27 00:41:43.501107 | orchestrator | 2025-09-27 00:41:43.501118 | orchestrator | TASK [Gather DB VGs 
with total and available size in bytes] ******************** 2025-09-27 00:41:43.501129 | orchestrator | Saturday 27 September 2025 00:41:40 +0000 (0:00:00.155) 0:00:39.978 **** 2025-09-27 00:41:43.501139 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:41:43.501150 | orchestrator | 2025-09-27 00:41:43.501160 | orchestrator | TASK [Gather WAL VGs with total and available size in bytes] ******************* 2025-09-27 00:41:43.501171 | orchestrator | Saturday 27 September 2025 00:41:41 +0000 (0:00:00.637) 0:00:40.616 **** 2025-09-27 00:41:43.501187 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:41:43.501198 | orchestrator | 2025-09-27 00:41:43.501229 | orchestrator | TASK [Gather DB+WAL VGs with total and available size in bytes] **************** 2025-09-27 00:41:43.501239 | orchestrator | Saturday 27 September 2025 00:41:42 +0000 (0:00:00.525) 0:00:41.141 **** 2025-09-27 00:41:43.501250 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:41:43.501261 | orchestrator | 2025-09-27 00:41:43.501272 | orchestrator | TASK [Combine JSON from _db/wal/db_wal_vgs_cmd_output] ************************* 2025-09-27 00:41:43.501282 | orchestrator | Saturday 27 September 2025 00:41:42 +0000 (0:00:00.500) 0:00:41.642 **** 2025-09-27 00:41:43.501293 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:41:43.501304 | orchestrator | 2025-09-27 00:41:43.501314 | orchestrator | TASK [Calculate VG sizes (without buffer)] ************************************* 2025-09-27 00:41:43.501325 | orchestrator | Saturday 27 September 2025 00:41:42 +0000 (0:00:00.125) 0:00:41.767 **** 2025-09-27 00:41:43.501336 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:41:43.501346 | orchestrator | 2025-09-27 00:41:43.501357 | orchestrator | TASK [Calculate VG sizes (with buffer)] **************************************** 2025-09-27 00:41:43.501367 | orchestrator | Saturday 27 September 2025 00:41:42 +0000 (0:00:00.111) 0:00:41.879 **** 2025-09-27 00:41:43.501378 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:41:43.501389 | orchestrator | 2025-09-27 00:41:43.501399 | orchestrator | TASK [Print LVM VGs report data] *********************************************** 2025-09-27 00:41:43.501410 | orchestrator | Saturday 27 September 2025 00:41:42 +0000 (0:00:00.084) 0:00:41.963 **** 2025-09-27 00:41:43.501421 | orchestrator | ok: [testbed-node-4] => { 2025-09-27 00:41:43.501432 | orchestrator |  "vgs_report": { 2025-09-27 00:41:43.501443 | orchestrator |  "vg": [] 2025-09-27 00:41:43.501454 | orchestrator |  } 2025-09-27 00:41:43.501465 | orchestrator | } 2025-09-27 00:41:43.501476 | orchestrator | 2025-09-27 00:41:43.501487 | orchestrator | TASK [Print LVM VG sizes] ****************************************************** 2025-09-27 00:41:43.501497 | orchestrator | Saturday 27 September 2025 00:41:43 +0000 (0:00:00.130) 0:00:42.094 **** 2025-09-27 00:41:43.501508 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:41:43.501519 | orchestrator | 2025-09-27 00:41:43.501529 | orchestrator | TASK [Calculate size needed for LVs on ceph_db_devices] ************************ 2025-09-27 00:41:43.501540 | orchestrator | Saturday 27 September 2025 00:41:43 +0000 (0:00:00.112) 0:00:42.207 **** 2025-09-27 00:41:43.501550 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:41:43.501561 | orchestrator | 2025-09-27 00:41:43.501572 | orchestrator | TASK [Print size needed for LVs on ceph_db_devices] **************************** 2025-09-27 00:41:43.501583 | orchestrator | Saturday 27 September 2025 00:41:43 +0000 
(0:00:00.115) 0:00:42.323 **** 2025-09-27 00:41:43.501594 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:41:43.501604 | orchestrator | 2025-09-27 00:41:43.501615 | orchestrator | TASK [Fail if size of DB LVs on ceph_db_devices > available] ******************* 2025-09-27 00:41:43.501626 | orchestrator | Saturday 27 September 2025 00:41:43 +0000 (0:00:00.113) 0:00:42.436 **** 2025-09-27 00:41:43.501636 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:41:43.501647 | orchestrator | 2025-09-27 00:41:43.501658 | orchestrator | TASK [Calculate size needed for LVs on ceph_wal_devices] *********************** 2025-09-27 00:41:43.501676 | orchestrator | Saturday 27 September 2025 00:41:43 +0000 (0:00:00.143) 0:00:42.580 **** 2025-09-27 00:41:47.876902 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:41:47.877011 | orchestrator | 2025-09-27 00:41:47.877058 | orchestrator | TASK [Print size needed for LVs on ceph_wal_devices] *************************** 2025-09-27 00:41:47.877075 | orchestrator | Saturday 27 September 2025 00:41:43 +0000 (0:00:00.108) 0:00:42.689 **** 2025-09-27 00:41:47.877089 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:41:47.877103 | orchestrator | 2025-09-27 00:41:47.877116 | orchestrator | TASK [Fail if size of WAL LVs on ceph_wal_devices > available] ***************** 2025-09-27 00:41:47.877130 | orchestrator | Saturday 27 September 2025 00:41:43 +0000 (0:00:00.260) 0:00:42.949 **** 2025-09-27 00:41:47.877143 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:41:47.877155 | orchestrator | 2025-09-27 00:41:47.877168 | orchestrator | TASK [Calculate size needed for WAL LVs on ceph_db_wal_devices] **************** 2025-09-27 00:41:47.877181 | orchestrator | Saturday 27 September 2025 00:41:43 +0000 (0:00:00.127) 0:00:43.076 **** 2025-09-27 00:41:47.877194 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:41:47.877270 | orchestrator | 2025-09-27 00:41:47.877285 | orchestrator | TASK [Print size needed for WAL LVs on ceph_db_wal_devices] ******************** 2025-09-27 00:41:47.877297 | orchestrator | Saturday 27 September 2025 00:41:44 +0000 (0:00:00.127) 0:00:43.204 **** 2025-09-27 00:41:47.877309 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:41:47.877323 | orchestrator | 2025-09-27 00:41:47.877335 | orchestrator | TASK [Calculate size needed for DB LVs on ceph_db_wal_devices] ***************** 2025-09-27 00:41:47.877348 | orchestrator | Saturday 27 September 2025 00:41:44 +0000 (0:00:00.128) 0:00:43.332 **** 2025-09-27 00:41:47.877361 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:41:47.877373 | orchestrator | 2025-09-27 00:41:47.877386 | orchestrator | TASK [Print size needed for DB LVs on ceph_db_wal_devices] ********************* 2025-09-27 00:41:47.877400 | orchestrator | Saturday 27 September 2025 00:41:44 +0000 (0:00:00.125) 0:00:43.458 **** 2025-09-27 00:41:47.877412 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:41:47.877425 | orchestrator | 2025-09-27 00:41:47.877439 | orchestrator | TASK [Fail if size of DB+WAL LVs on ceph_db_wal_devices > available] *********** 2025-09-27 00:41:47.877452 | orchestrator | Saturday 27 September 2025 00:41:44 +0000 (0:00:00.124) 0:00:43.582 **** 2025-09-27 00:41:47.877464 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:41:47.877477 | orchestrator | 2025-09-27 00:41:47.877490 | orchestrator | TASK [Fail if DB LV size < 30 GiB for ceph_db_devices] ************************* 2025-09-27 00:41:47.877503 | orchestrator | Saturday 27 September 2025 
00:41:44 +0000 (0:00:00.119) 0:00:43.702 **** 2025-09-27 00:41:47.877518 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:41:47.877532 | orchestrator | 2025-09-27 00:41:47.877544 | orchestrator | TASK [Fail if DB LV size < 30 GiB for ceph_db_wal_devices] ********************* 2025-09-27 00:41:47.877557 | orchestrator | Saturday 27 September 2025 00:41:44 +0000 (0:00:00.133) 0:00:43.836 **** 2025-09-27 00:41:47.877571 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:41:47.877585 | orchestrator | 2025-09-27 00:41:47.877598 | orchestrator | TASK [Create DB LVs for ceph_db_devices] *************************************** 2025-09-27 00:41:47.877611 | orchestrator | Saturday 27 September 2025 00:41:44 +0000 (0:00:00.127) 0:00:43.964 **** 2025-09-27 00:41:47.877643 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-e62f59a6-4044-5e93-b85c-9f8cca280e9f', 'data_vg': 'ceph-e62f59a6-4044-5e93-b85c-9f8cca280e9f'})  2025-09-27 00:41:47.877659 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-634a63d2-bd22-5328-9676-28392545ed43', 'data_vg': 'ceph-634a63d2-bd22-5328-9676-28392545ed43'})  2025-09-27 00:41:47.877673 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:41:47.877687 | orchestrator | 2025-09-27 00:41:47.877700 | orchestrator | TASK [Print 'Create DB LVs for ceph_db_devices'] ******************************* 2025-09-27 00:41:47.877713 | orchestrator | Saturday 27 September 2025 00:41:45 +0000 (0:00:00.137) 0:00:44.101 **** 2025-09-27 00:41:47.877725 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-e62f59a6-4044-5e93-b85c-9f8cca280e9f', 'data_vg': 'ceph-e62f59a6-4044-5e93-b85c-9f8cca280e9f'})  2025-09-27 00:41:47.877738 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-634a63d2-bd22-5328-9676-28392545ed43', 'data_vg': 'ceph-634a63d2-bd22-5328-9676-28392545ed43'})  2025-09-27 00:41:47.877764 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:41:47.877777 | orchestrator | 2025-09-27 00:41:47.877790 | orchestrator | TASK [Create WAL LVs for ceph_wal_devices] ************************************* 2025-09-27 00:41:47.877804 | orchestrator | Saturday 27 September 2025 00:41:45 +0000 (0:00:00.135) 0:00:44.237 **** 2025-09-27 00:41:47.877817 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-e62f59a6-4044-5e93-b85c-9f8cca280e9f', 'data_vg': 'ceph-e62f59a6-4044-5e93-b85c-9f8cca280e9f'})  2025-09-27 00:41:47.877830 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-634a63d2-bd22-5328-9676-28392545ed43', 'data_vg': 'ceph-634a63d2-bd22-5328-9676-28392545ed43'})  2025-09-27 00:41:47.877842 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:41:47.877856 | orchestrator | 2025-09-27 00:41:47.877870 | orchestrator | TASK [Print 'Create WAL LVs for ceph_wal_devices'] ***************************** 2025-09-27 00:41:47.877884 | orchestrator | Saturday 27 September 2025 00:41:45 +0000 (0:00:00.144) 0:00:44.382 **** 2025-09-27 00:41:47.877896 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-e62f59a6-4044-5e93-b85c-9f8cca280e9f', 'data_vg': 'ceph-e62f59a6-4044-5e93-b85c-9f8cca280e9f'})  2025-09-27 00:41:47.877909 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-634a63d2-bd22-5328-9676-28392545ed43', 'data_vg': 'ceph-634a63d2-bd22-5328-9676-28392545ed43'})  2025-09-27 00:41:47.877923 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:41:47.877936 | orchestrator | 2025-09-27 00:41:47.877949 | 
orchestrator | TASK [Create WAL LVs for ceph_db_wal_devices] ********************************** 2025-09-27 00:41:47.877983 | orchestrator | Saturday 27 September 2025 00:41:45 +0000 (0:00:00.265) 0:00:44.647 **** 2025-09-27 00:41:47.877997 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-e62f59a6-4044-5e93-b85c-9f8cca280e9f', 'data_vg': 'ceph-e62f59a6-4044-5e93-b85c-9f8cca280e9f'})  2025-09-27 00:41:47.878010 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-634a63d2-bd22-5328-9676-28392545ed43', 'data_vg': 'ceph-634a63d2-bd22-5328-9676-28392545ed43'})  2025-09-27 00:41:47.878084 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:41:47.878098 | orchestrator | 2025-09-27 00:41:47.878112 | orchestrator | TASK [Print 'Create WAL LVs for ceph_db_wal_devices'] ************************** 2025-09-27 00:41:47.878125 | orchestrator | Saturday 27 September 2025 00:41:45 +0000 (0:00:00.142) 0:00:44.790 **** 2025-09-27 00:41:47.878138 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-e62f59a6-4044-5e93-b85c-9f8cca280e9f', 'data_vg': 'ceph-e62f59a6-4044-5e93-b85c-9f8cca280e9f'})  2025-09-27 00:41:47.878152 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-634a63d2-bd22-5328-9676-28392545ed43', 'data_vg': 'ceph-634a63d2-bd22-5328-9676-28392545ed43'})  2025-09-27 00:41:47.878166 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:41:47.878180 | orchestrator | 2025-09-27 00:41:47.878193 | orchestrator | TASK [Create DB LVs for ceph_db_wal_devices] *********************************** 2025-09-27 00:41:47.878233 | orchestrator | Saturday 27 September 2025 00:41:45 +0000 (0:00:00.133) 0:00:44.923 **** 2025-09-27 00:41:47.878246 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-e62f59a6-4044-5e93-b85c-9f8cca280e9f', 'data_vg': 'ceph-e62f59a6-4044-5e93-b85c-9f8cca280e9f'})  2025-09-27 00:41:47.878259 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-634a63d2-bd22-5328-9676-28392545ed43', 'data_vg': 'ceph-634a63d2-bd22-5328-9676-28392545ed43'})  2025-09-27 00:41:47.878272 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:41:47.878285 | orchestrator | 2025-09-27 00:41:47.878298 | orchestrator | TASK [Print 'Create DB LVs for ceph_db_wal_devices'] *************************** 2025-09-27 00:41:47.878310 | orchestrator | Saturday 27 September 2025 00:41:46 +0000 (0:00:00.166) 0:00:45.090 **** 2025-09-27 00:41:47.878324 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-e62f59a6-4044-5e93-b85c-9f8cca280e9f', 'data_vg': 'ceph-e62f59a6-4044-5e93-b85c-9f8cca280e9f'})  2025-09-27 00:41:47.878350 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-634a63d2-bd22-5328-9676-28392545ed43', 'data_vg': 'ceph-634a63d2-bd22-5328-9676-28392545ed43'})  2025-09-27 00:41:47.878365 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:41:47.878378 | orchestrator | 2025-09-27 00:41:47.878391 | orchestrator | TASK [Get list of Ceph LVs with associated VGs] ******************************** 2025-09-27 00:41:47.878447 | orchestrator | Saturday 27 September 2025 00:41:46 +0000 (0:00:00.147) 0:00:45.237 **** 2025-09-27 00:41:47.878465 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:41:47.878478 | orchestrator | 2025-09-27 00:41:47.878491 | orchestrator | TASK [Get list of Ceph PVs with associated VGs] ******************************** 2025-09-27 00:41:47.878503 | orchestrator | Saturday 27 September 2025 00:41:46 +0000 (0:00:00.513) 
0:00:45.750 **** 2025-09-27 00:41:47.878517 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:41:47.878529 | orchestrator | 2025-09-27 00:41:47.878542 | orchestrator | TASK [Combine JSON from _lvs_cmd_output/_pvs_cmd_output] *********************** 2025-09-27 00:41:47.878555 | orchestrator | Saturday 27 September 2025 00:41:47 +0000 (0:00:00.538) 0:00:46.289 **** 2025-09-27 00:41:47.878568 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:41:47.878580 | orchestrator | 2025-09-27 00:41:47.878594 | orchestrator | TASK [Create list of VG/LV names] ********************************************** 2025-09-27 00:41:47.878606 | orchestrator | Saturday 27 September 2025 00:41:47 +0000 (0:00:00.152) 0:00:46.441 **** 2025-09-27 00:41:47.878620 | orchestrator | ok: [testbed-node-4] => (item={'lv_name': 'osd-block-634a63d2-bd22-5328-9676-28392545ed43', 'vg_name': 'ceph-634a63d2-bd22-5328-9676-28392545ed43'}) 2025-09-27 00:41:47.878634 | orchestrator | ok: [testbed-node-4] => (item={'lv_name': 'osd-block-e62f59a6-4044-5e93-b85c-9f8cca280e9f', 'vg_name': 'ceph-e62f59a6-4044-5e93-b85c-9f8cca280e9f'}) 2025-09-27 00:41:47.878646 | orchestrator | 2025-09-27 00:41:47.878658 | orchestrator | TASK [Fail if block LV defined in lvm_volumes is missing] ********************** 2025-09-27 00:41:47.878671 | orchestrator | Saturday 27 September 2025 00:41:47 +0000 (0:00:00.183) 0:00:46.625 **** 2025-09-27 00:41:47.878684 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-e62f59a6-4044-5e93-b85c-9f8cca280e9f', 'data_vg': 'ceph-e62f59a6-4044-5e93-b85c-9f8cca280e9f'})  2025-09-27 00:41:47.878697 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-634a63d2-bd22-5328-9676-28392545ed43', 'data_vg': 'ceph-634a63d2-bd22-5328-9676-28392545ed43'})  2025-09-27 00:41:47.878710 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:41:47.878723 | orchestrator | 2025-09-27 00:41:47.878735 | orchestrator | TASK [Fail if DB LV defined in lvm_volumes is missing] ************************* 2025-09-27 00:41:47.878748 | orchestrator | Saturday 27 September 2025 00:41:47 +0000 (0:00:00.166) 0:00:46.791 **** 2025-09-27 00:41:47.878762 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-e62f59a6-4044-5e93-b85c-9f8cca280e9f', 'data_vg': 'ceph-e62f59a6-4044-5e93-b85c-9f8cca280e9f'})  2025-09-27 00:41:47.878775 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-634a63d2-bd22-5328-9676-28392545ed43', 'data_vg': 'ceph-634a63d2-bd22-5328-9676-28392545ed43'})  2025-09-27 00:41:47.878805 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:41:54.236857 | orchestrator | 2025-09-27 00:41:54.237885 | orchestrator | TASK [Fail if WAL LV defined in lvm_volumes is missing] ************************ 2025-09-27 00:41:54.237962 | orchestrator | Saturday 27 September 2025 00:41:47 +0000 (0:00:00.163) 0:00:46.955 **** 2025-09-27 00:41:54.237991 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-e62f59a6-4044-5e93-b85c-9f8cca280e9f', 'data_vg': 'ceph-e62f59a6-4044-5e93-b85c-9f8cca280e9f'})  2025-09-27 00:41:54.238006 | orchestrator | skipping: [testbed-node-4] => (item={'data': 'osd-block-634a63d2-bd22-5328-9676-28392545ed43', 'data_vg': 'ceph-634a63d2-bd22-5328-9676-28392545ed43'})  2025-09-27 00:41:54.238071 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:41:54.238087 | orchestrator | 2025-09-27 00:41:54.238099 | orchestrator | TASK [Print LVM report data] *************************************************** 2025-09-27 00:41:54.238142 
| orchestrator | Saturday 27 September 2025 00:41:48 +0000 (0:00:00.154) 0:00:47.109 **** 2025-09-27 00:41:54.238179 | orchestrator | ok: [testbed-node-4] => { 2025-09-27 00:41:54.238191 | orchestrator |  "lvm_report": { 2025-09-27 00:41:54.238233 | orchestrator |  "lv": [ 2025-09-27 00:41:54.238246 | orchestrator |  { 2025-09-27 00:41:54.238257 | orchestrator |  "lv_name": "osd-block-634a63d2-bd22-5328-9676-28392545ed43", 2025-09-27 00:41:54.238269 | orchestrator |  "vg_name": "ceph-634a63d2-bd22-5328-9676-28392545ed43" 2025-09-27 00:41:54.238280 | orchestrator |  }, 2025-09-27 00:41:54.238291 | orchestrator |  { 2025-09-27 00:41:54.238302 | orchestrator |  "lv_name": "osd-block-e62f59a6-4044-5e93-b85c-9f8cca280e9f", 2025-09-27 00:41:54.238312 | orchestrator |  "vg_name": "ceph-e62f59a6-4044-5e93-b85c-9f8cca280e9f" 2025-09-27 00:41:54.238323 | orchestrator |  } 2025-09-27 00:41:54.238334 | orchestrator |  ], 2025-09-27 00:41:54.238344 | orchestrator |  "pv": [ 2025-09-27 00:41:54.238355 | orchestrator |  { 2025-09-27 00:41:54.238366 | orchestrator |  "pv_name": "/dev/sdb", 2025-09-27 00:41:54.238377 | orchestrator |  "vg_name": "ceph-e62f59a6-4044-5e93-b85c-9f8cca280e9f" 2025-09-27 00:41:54.238387 | orchestrator |  }, 2025-09-27 00:41:54.238398 | orchestrator |  { 2025-09-27 00:41:54.238409 | orchestrator |  "pv_name": "/dev/sdc", 2025-09-27 00:41:54.238419 | orchestrator |  "vg_name": "ceph-634a63d2-bd22-5328-9676-28392545ed43" 2025-09-27 00:41:54.238430 | orchestrator |  } 2025-09-27 00:41:54.238441 | orchestrator |  ] 2025-09-27 00:41:54.238451 | orchestrator |  } 2025-09-27 00:41:54.238462 | orchestrator | } 2025-09-27 00:41:54.238473 | orchestrator | 2025-09-27 00:41:54.238484 | orchestrator | PLAY [Ceph create LVM devices] ************************************************* 2025-09-27 00:41:54.238495 | orchestrator | 2025-09-27 00:41:54.238506 | orchestrator | TASK [Get extra vars for Ceph configuration] *********************************** 2025-09-27 00:41:54.238516 | orchestrator | Saturday 27 September 2025 00:41:48 +0000 (0:00:00.499) 0:00:47.608 **** 2025-09-27 00:41:54.238527 | orchestrator | ok: [testbed-node-5 -> testbed-manager(192.168.16.5)] 2025-09-27 00:41:54.238538 | orchestrator | 2025-09-27 00:41:54.238563 | orchestrator | TASK [Get initial list of available block devices] ***************************** 2025-09-27 00:41:54.238575 | orchestrator | Saturday 27 September 2025 00:41:48 +0000 (0:00:00.243) 0:00:47.852 **** 2025-09-27 00:41:54.238585 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:41:54.238597 | orchestrator | 2025-09-27 00:41:54.238608 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-09-27 00:41:54.238618 | orchestrator | Saturday 27 September 2025 00:41:48 +0000 (0:00:00.220) 0:00:48.073 **** 2025-09-27 00:41:54.238629 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-5 => (item=loop0) 2025-09-27 00:41:54.238640 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-5 => (item=loop1) 2025-09-27 00:41:54.238651 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-5 => (item=loop2) 2025-09-27 00:41:54.238662 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-5 => (item=loop3) 2025-09-27 00:41:54.238672 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-5 => (item=loop4) 2025-09-27 00:41:54.238683 | orchestrator | included: /ansible/tasks/_add-device-links.yml 
for testbed-node-5 => (item=loop5) 2025-09-27 00:41:54.238694 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-5 => (item=loop6) 2025-09-27 00:41:54.238704 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-5 => (item=loop7) 2025-09-27 00:41:54.238715 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-5 => (item=sda) 2025-09-27 00:41:54.238726 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-5 => (item=sdb) 2025-09-27 00:41:54.238736 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-5 => (item=sdc) 2025-09-27 00:41:54.238755 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-5 => (item=sdd) 2025-09-27 00:41:54.238766 | orchestrator | included: /ansible/tasks/_add-device-links.yml for testbed-node-5 => (item=sr0) 2025-09-27 00:41:54.238790 | orchestrator | 2025-09-27 00:41:54.238801 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-09-27 00:41:54.238812 | orchestrator | Saturday 27 September 2025 00:41:49 +0000 (0:00:00.422) 0:00:48.495 **** 2025-09-27 00:41:54.238822 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:41:54.238837 | orchestrator | 2025-09-27 00:41:54.238848 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-09-27 00:41:54.238859 | orchestrator | Saturday 27 September 2025 00:41:49 +0000 (0:00:00.203) 0:00:48.698 **** 2025-09-27 00:41:54.238870 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:41:54.238881 | orchestrator | 2025-09-27 00:41:54.238891 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-09-27 00:41:54.238929 | orchestrator | Saturday 27 September 2025 00:41:49 +0000 (0:00:00.202) 0:00:48.900 **** 2025-09-27 00:41:54.238941 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:41:54.238952 | orchestrator | 2025-09-27 00:41:54.238963 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-09-27 00:41:54.238973 | orchestrator | Saturday 27 September 2025 00:41:50 +0000 (0:00:00.195) 0:00:49.096 **** 2025-09-27 00:41:54.238984 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:41:54.238994 | orchestrator | 2025-09-27 00:41:54.239005 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-09-27 00:41:54.239016 | orchestrator | Saturday 27 September 2025 00:41:50 +0000 (0:00:00.199) 0:00:49.295 **** 2025-09-27 00:41:54.239027 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:41:54.239037 | orchestrator | 2025-09-27 00:41:54.239048 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-09-27 00:41:54.239059 | orchestrator | Saturday 27 September 2025 00:41:50 +0000 (0:00:00.212) 0:00:49.508 **** 2025-09-27 00:41:54.239069 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:41:54.239080 | orchestrator | 2025-09-27 00:41:54.239091 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-09-27 00:41:54.239119 | orchestrator | Saturday 27 September 2025 00:41:51 +0000 (0:00:00.640) 0:00:50.148 **** 2025-09-27 00:41:54.239130 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:41:54.239141 | orchestrator | 2025-09-27 00:41:54.239152 | orchestrator | TASK [Add known links to the list of available block 
devices] ****************** 2025-09-27 00:41:54.239175 | orchestrator | Saturday 27 September 2025 00:41:51 +0000 (0:00:00.212) 0:00:50.360 **** 2025-09-27 00:41:54.239185 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:41:54.239196 | orchestrator | 2025-09-27 00:41:54.239226 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-09-27 00:41:54.239238 | orchestrator | Saturday 27 September 2025 00:41:51 +0000 (0:00:00.198) 0:00:50.559 **** 2025-09-27 00:41:54.239248 | orchestrator | ok: [testbed-node-5] => (item=scsi-0QEMU_QEMU_HARDDISK_140074d2-f452-403c-a39b-9b7f03d301d0) 2025-09-27 00:41:54.239260 | orchestrator | ok: [testbed-node-5] => (item=scsi-SQEMU_QEMU_HARDDISK_140074d2-f452-403c-a39b-9b7f03d301d0) 2025-09-27 00:41:54.239271 | orchestrator | 2025-09-27 00:41:54.239330 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-09-27 00:41:54.239344 | orchestrator | Saturday 27 September 2025 00:41:51 +0000 (0:00:00.406) 0:00:50.965 **** 2025-09-27 00:41:54.239355 | orchestrator | ok: [testbed-node-5] => (item=scsi-0QEMU_QEMU_HARDDISK_44ee43e4-0ad4-479b-91ef-60ee60e7859d) 2025-09-27 00:41:54.239366 | orchestrator | ok: [testbed-node-5] => (item=scsi-SQEMU_QEMU_HARDDISK_44ee43e4-0ad4-479b-91ef-60ee60e7859d) 2025-09-27 00:41:54.239377 | orchestrator | 2025-09-27 00:41:54.239387 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-09-27 00:41:54.239398 | orchestrator | Saturday 27 September 2025 00:41:52 +0000 (0:00:00.492) 0:00:51.458 **** 2025-09-27 00:41:54.239424 | orchestrator | ok: [testbed-node-5] => (item=scsi-0QEMU_QEMU_HARDDISK_3491b7a4-1f4d-422d-b24b-7572a092bd2f) 2025-09-27 00:41:54.239436 | orchestrator | ok: [testbed-node-5] => (item=scsi-SQEMU_QEMU_HARDDISK_3491b7a4-1f4d-422d-b24b-7572a092bd2f) 2025-09-27 00:41:54.239447 | orchestrator | 2025-09-27 00:41:54.239495 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-09-27 00:41:54.239506 | orchestrator | Saturday 27 September 2025 00:41:52 +0000 (0:00:00.414) 0:00:51.872 **** 2025-09-27 00:41:54.239517 | orchestrator | ok: [testbed-node-5] => (item=scsi-0QEMU_QEMU_HARDDISK_06352aa6-6cdc-4b09-96e0-787a93e7d706) 2025-09-27 00:41:54.239528 | orchestrator | ok: [testbed-node-5] => (item=scsi-SQEMU_QEMU_HARDDISK_06352aa6-6cdc-4b09-96e0-787a93e7d706) 2025-09-27 00:41:54.239539 | orchestrator | 2025-09-27 00:41:54.239549 | orchestrator | TASK [Add known links to the list of available block devices] ****************** 2025-09-27 00:41:54.239575 | orchestrator | Saturday 27 September 2025 00:41:53 +0000 (0:00:00.561) 0:00:52.434 **** 2025-09-27 00:41:54.239586 | orchestrator | ok: [testbed-node-5] => (item=ata-QEMU_DVD-ROM_QM00001) 2025-09-27 00:41:54.239596 | orchestrator | 2025-09-27 00:41:54.239607 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-09-27 00:41:54.239618 | orchestrator | Saturday 27 September 2025 00:41:53 +0000 (0:00:00.396) 0:00:52.831 **** 2025-09-27 00:41:54.239629 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-5 => (item=loop0) 2025-09-27 00:41:54.239639 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-5 => (item=loop1) 2025-09-27 00:41:54.239650 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-5 => (item=loop2) 2025-09-27 00:41:54.239660 | 
orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-5 => (item=loop3) 2025-09-27 00:41:54.239671 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-5 => (item=loop4) 2025-09-27 00:41:54.239681 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-5 => (item=loop5) 2025-09-27 00:41:54.239692 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-5 => (item=loop6) 2025-09-27 00:41:54.239702 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-5 => (item=loop7) 2025-09-27 00:41:54.239713 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-5 => (item=sda) 2025-09-27 00:41:54.239724 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-5 => (item=sdb) 2025-09-27 00:41:54.239734 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-5 => (item=sdc) 2025-09-27 00:41:54.239754 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-5 => (item=sdd) 2025-09-27 00:42:03.594418 | orchestrator | included: /ansible/tasks/_add-device-partitions.yml for testbed-node-5 => (item=sr0) 2025-09-27 00:42:03.594533 | orchestrator | 2025-09-27 00:42:03.594549 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-09-27 00:42:03.594561 | orchestrator | Saturday 27 September 2025 00:41:54 +0000 (0:00:00.475) 0:00:53.306 **** 2025-09-27 00:42:03.594572 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:42:03.594584 | orchestrator | 2025-09-27 00:42:03.594595 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-09-27 00:42:03.594606 | orchestrator | Saturday 27 September 2025 00:41:54 +0000 (0:00:00.239) 0:00:53.546 **** 2025-09-27 00:42:03.594617 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:42:03.594628 | orchestrator | 2025-09-27 00:42:03.594639 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-09-27 00:42:03.594650 | orchestrator | Saturday 27 September 2025 00:41:54 +0000 (0:00:00.253) 0:00:53.799 **** 2025-09-27 00:42:03.594661 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:42:03.594671 | orchestrator | 2025-09-27 00:42:03.594682 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-09-27 00:42:03.594714 | orchestrator | Saturday 27 September 2025 00:41:55 +0000 (0:00:00.769) 0:00:54.568 **** 2025-09-27 00:42:03.594725 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:42:03.594736 | orchestrator | 2025-09-27 00:42:03.594747 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-09-27 00:42:03.594757 | orchestrator | Saturday 27 September 2025 00:41:55 +0000 (0:00:00.227) 0:00:54.796 **** 2025-09-27 00:42:03.594768 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:42:03.594779 | orchestrator | 2025-09-27 00:42:03.594789 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-09-27 00:42:03.594800 | orchestrator | Saturday 27 September 2025 00:41:55 +0000 (0:00:00.201) 0:00:54.997 **** 2025-09-27 00:42:03.594811 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:42:03.594821 | orchestrator | 2025-09-27 00:42:03.594832 | orchestrator | TASK [Add known partitions to the list of available 
block devices] ************* 2025-09-27 00:42:03.594843 | orchestrator | Saturday 27 September 2025 00:41:56 +0000 (0:00:00.209) 0:00:55.206 **** 2025-09-27 00:42:03.594853 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:42:03.594864 | orchestrator | 2025-09-27 00:42:03.594874 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-09-27 00:42:03.594885 | orchestrator | Saturday 27 September 2025 00:41:56 +0000 (0:00:00.208) 0:00:55.415 **** 2025-09-27 00:42:03.594896 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:42:03.594907 | orchestrator | 2025-09-27 00:42:03.594917 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-09-27 00:42:03.594928 | orchestrator | Saturday 27 September 2025 00:41:56 +0000 (0:00:00.218) 0:00:55.633 **** 2025-09-27 00:42:03.594938 | orchestrator | ok: [testbed-node-5] => (item=sda1) 2025-09-27 00:42:03.594949 | orchestrator | ok: [testbed-node-5] => (item=sda14) 2025-09-27 00:42:03.594961 | orchestrator | ok: [testbed-node-5] => (item=sda15) 2025-09-27 00:42:03.594972 | orchestrator | ok: [testbed-node-5] => (item=sda16) 2025-09-27 00:42:03.594982 | orchestrator | 2025-09-27 00:42:03.594993 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-09-27 00:42:03.595004 | orchestrator | Saturday 27 September 2025 00:41:57 +0000 (0:00:00.684) 0:00:56.318 **** 2025-09-27 00:42:03.595015 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:42:03.595025 | orchestrator | 2025-09-27 00:42:03.595036 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-09-27 00:42:03.595046 | orchestrator | Saturday 27 September 2025 00:41:57 +0000 (0:00:00.210) 0:00:56.528 **** 2025-09-27 00:42:03.595057 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:42:03.595068 | orchestrator | 2025-09-27 00:42:03.595079 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-09-27 00:42:03.595089 | orchestrator | Saturday 27 September 2025 00:41:57 +0000 (0:00:00.224) 0:00:56.753 **** 2025-09-27 00:42:03.595100 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:42:03.595111 | orchestrator | 2025-09-27 00:42:03.595121 | orchestrator | TASK [Add known partitions to the list of available block devices] ************* 2025-09-27 00:42:03.595132 | orchestrator | Saturday 27 September 2025 00:41:57 +0000 (0:00:00.210) 0:00:56.963 **** 2025-09-27 00:42:03.595142 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:42:03.595153 | orchestrator | 2025-09-27 00:42:03.595164 | orchestrator | TASK [Check whether ceph_db_wal_devices is used exclusively] ******************* 2025-09-27 00:42:03.595174 | orchestrator | Saturday 27 September 2025 00:41:58 +0000 (0:00:00.206) 0:00:57.170 **** 2025-09-27 00:42:03.595185 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:42:03.595196 | orchestrator | 2025-09-27 00:42:03.595232 | orchestrator | TASK [Create dict of block VGs -> PVs from ceph_osd_devices] ******************* 2025-09-27 00:42:03.595244 | orchestrator | Saturday 27 September 2025 00:41:58 +0000 (0:00:00.399) 0:00:57.570 **** 2025-09-27 00:42:03.595254 | orchestrator | ok: [testbed-node-5] => (item={'key': 'sdb', 'value': {'osd_lvm_uuid': '03e94b17-8e91-5aba-9ae0-0b9f0a63cf06'}}) 2025-09-27 00:42:03.595266 | orchestrator | ok: [testbed-node-5] => (item={'key': 'sdc', 'value': {'osd_lvm_uuid': 
'26537eb5-d37a-51fe-a7ad-0ae3582304de'}}) 2025-09-27 00:42:03.595284 | orchestrator | 2025-09-27 00:42:03.595295 | orchestrator | TASK [Create block VGs] ******************************************************** 2025-09-27 00:42:03.595306 | orchestrator | Saturday 27 September 2025 00:41:58 +0000 (0:00:00.197) 0:00:57.768 **** 2025-09-27 00:42:03.595318 | orchestrator | changed: [testbed-node-5] => (item={'data': 'osd-block-03e94b17-8e91-5aba-9ae0-0b9f0a63cf06', 'data_vg': 'ceph-03e94b17-8e91-5aba-9ae0-0b9f0a63cf06'}) 2025-09-27 00:42:03.595329 | orchestrator | changed: [testbed-node-5] => (item={'data': 'osd-block-26537eb5-d37a-51fe-a7ad-0ae3582304de', 'data_vg': 'ceph-26537eb5-d37a-51fe-a7ad-0ae3582304de'}) 2025-09-27 00:42:03.595340 | orchestrator | 2025-09-27 00:42:03.595351 | orchestrator | TASK [Print 'Create block VGs'] ************************************************ 2025-09-27 00:42:03.595378 | orchestrator | Saturday 27 September 2025 00:42:00 +0000 (0:00:01.895) 0:00:59.663 **** 2025-09-27 00:42:03.595390 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-03e94b17-8e91-5aba-9ae0-0b9f0a63cf06', 'data_vg': 'ceph-03e94b17-8e91-5aba-9ae0-0b9f0a63cf06'})  2025-09-27 00:42:03.595402 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-26537eb5-d37a-51fe-a7ad-0ae3582304de', 'data_vg': 'ceph-26537eb5-d37a-51fe-a7ad-0ae3582304de'})  2025-09-27 00:42:03.595413 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:42:03.595424 | orchestrator | 2025-09-27 00:42:03.595434 | orchestrator | TASK [Create block LVs] ******************************************************** 2025-09-27 00:42:03.595445 | orchestrator | Saturday 27 September 2025 00:42:00 +0000 (0:00:00.156) 0:00:59.819 **** 2025-09-27 00:42:03.595456 | orchestrator | changed: [testbed-node-5] => (item={'data': 'osd-block-03e94b17-8e91-5aba-9ae0-0b9f0a63cf06', 'data_vg': 'ceph-03e94b17-8e91-5aba-9ae0-0b9f0a63cf06'}) 2025-09-27 00:42:03.595485 | orchestrator | changed: [testbed-node-5] => (item={'data': 'osd-block-26537eb5-d37a-51fe-a7ad-0ae3582304de', 'data_vg': 'ceph-26537eb5-d37a-51fe-a7ad-0ae3582304de'}) 2025-09-27 00:42:03.595497 | orchestrator | 2025-09-27 00:42:03.595508 | orchestrator | TASK [Print 'Create block LVs'] ************************************************ 2025-09-27 00:42:03.595519 | orchestrator | Saturday 27 September 2025 00:42:02 +0000 (0:00:01.333) 0:01:01.153 **** 2025-09-27 00:42:03.595529 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-03e94b17-8e91-5aba-9ae0-0b9f0a63cf06', 'data_vg': 'ceph-03e94b17-8e91-5aba-9ae0-0b9f0a63cf06'})  2025-09-27 00:42:03.595541 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-26537eb5-d37a-51fe-a7ad-0ae3582304de', 'data_vg': 'ceph-26537eb5-d37a-51fe-a7ad-0ae3582304de'})  2025-09-27 00:42:03.595551 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:42:03.595562 | orchestrator | 2025-09-27 00:42:03.595573 | orchestrator | TASK [Create DB VGs] *********************************************************** 2025-09-27 00:42:03.595584 | orchestrator | Saturday 27 September 2025 00:42:02 +0000 (0:00:00.161) 0:01:01.315 **** 2025-09-27 00:42:03.595594 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:42:03.595605 | orchestrator | 2025-09-27 00:42:03.595616 | orchestrator | TASK [Print 'Create DB VGs'] *************************************************** 2025-09-27 00:42:03.595627 | orchestrator | Saturday 27 September 2025 00:42:02 +0000 (0:00:00.149) 0:01:01.464 **** 2025-09-27 
00:42:03.595638 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-03e94b17-8e91-5aba-9ae0-0b9f0a63cf06', 'data_vg': 'ceph-03e94b17-8e91-5aba-9ae0-0b9f0a63cf06'})  2025-09-27 00:42:03.595653 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-26537eb5-d37a-51fe-a7ad-0ae3582304de', 'data_vg': 'ceph-26537eb5-d37a-51fe-a7ad-0ae3582304de'})  2025-09-27 00:42:03.595664 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:42:03.595675 | orchestrator | 2025-09-27 00:42:03.595686 | orchestrator | TASK [Create WAL VGs] ********************************************************** 2025-09-27 00:42:03.595697 | orchestrator | Saturday 27 September 2025 00:42:02 +0000 (0:00:00.164) 0:01:01.629 **** 2025-09-27 00:42:03.595707 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:42:03.595725 | orchestrator | 2025-09-27 00:42:03.595736 | orchestrator | TASK [Print 'Create WAL VGs'] ************************************************** 2025-09-27 00:42:03.595746 | orchestrator | Saturday 27 September 2025 00:42:02 +0000 (0:00:00.135) 0:01:01.765 **** 2025-09-27 00:42:03.595757 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-03e94b17-8e91-5aba-9ae0-0b9f0a63cf06', 'data_vg': 'ceph-03e94b17-8e91-5aba-9ae0-0b9f0a63cf06'})  2025-09-27 00:42:03.595768 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-26537eb5-d37a-51fe-a7ad-0ae3582304de', 'data_vg': 'ceph-26537eb5-d37a-51fe-a7ad-0ae3582304de'})  2025-09-27 00:42:03.595779 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:42:03.595790 | orchestrator | 2025-09-27 00:42:03.595800 | orchestrator | TASK [Create DB+WAL VGs] ******************************************************* 2025-09-27 00:42:03.595811 | orchestrator | Saturday 27 September 2025 00:42:02 +0000 (0:00:00.143) 0:01:01.909 **** 2025-09-27 00:42:03.595822 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:42:03.595833 | orchestrator | 2025-09-27 00:42:03.595843 | orchestrator | TASK [Print 'Create DB+WAL VGs'] *********************************************** 2025-09-27 00:42:03.595854 | orchestrator | Saturday 27 September 2025 00:42:02 +0000 (0:00:00.127) 0:01:02.036 **** 2025-09-27 00:42:03.595865 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-03e94b17-8e91-5aba-9ae0-0b9f0a63cf06', 'data_vg': 'ceph-03e94b17-8e91-5aba-9ae0-0b9f0a63cf06'})  2025-09-27 00:42:03.595876 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-26537eb5-d37a-51fe-a7ad-0ae3582304de', 'data_vg': 'ceph-26537eb5-d37a-51fe-a7ad-0ae3582304de'})  2025-09-27 00:42:03.595886 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:42:03.595897 | orchestrator | 2025-09-27 00:42:03.595908 | orchestrator | TASK [Prepare variables for OSD count check] *********************************** 2025-09-27 00:42:03.595919 | orchestrator | Saturday 27 September 2025 00:42:03 +0000 (0:00:00.149) 0:01:02.186 **** 2025-09-27 00:42:03.595930 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:42:03.595940 | orchestrator | 2025-09-27 00:42:03.595951 | orchestrator | TASK [Count OSDs put on ceph_db_devices defined in lvm_volumes] **************** 2025-09-27 00:42:03.595962 | orchestrator | Saturday 27 September 2025 00:42:03 +0000 (0:00:00.139) 0:01:02.326 **** 2025-09-27 00:42:03.595980 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-03e94b17-8e91-5aba-9ae0-0b9f0a63cf06', 'data_vg': 'ceph-03e94b17-8e91-5aba-9ae0-0b9f0a63cf06'})  2025-09-27 00:42:09.637024 | orchestrator | 
skipping: [testbed-node-5] => (item={'data': 'osd-block-26537eb5-d37a-51fe-a7ad-0ae3582304de', 'data_vg': 'ceph-26537eb5-d37a-51fe-a7ad-0ae3582304de'})  2025-09-27 00:42:09.637136 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:42:09.637154 | orchestrator | 2025-09-27 00:42:09.637166 | orchestrator | TASK [Count OSDs put on ceph_wal_devices defined in lvm_volumes] *************** 2025-09-27 00:42:09.637179 | orchestrator | Saturday 27 September 2025 00:42:03 +0000 (0:00:00.348) 0:01:02.675 **** 2025-09-27 00:42:09.637191 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-03e94b17-8e91-5aba-9ae0-0b9f0a63cf06', 'data_vg': 'ceph-03e94b17-8e91-5aba-9ae0-0b9f0a63cf06'})  2025-09-27 00:42:09.637202 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-26537eb5-d37a-51fe-a7ad-0ae3582304de', 'data_vg': 'ceph-26537eb5-d37a-51fe-a7ad-0ae3582304de'})  2025-09-27 00:42:09.637254 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:42:09.637266 | orchestrator | 2025-09-27 00:42:09.637277 | orchestrator | TASK [Count OSDs put on ceph_db_wal_devices defined in lvm_volumes] ************ 2025-09-27 00:42:09.637288 | orchestrator | Saturday 27 September 2025 00:42:03 +0000 (0:00:00.155) 0:01:02.830 **** 2025-09-27 00:42:09.637299 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-03e94b17-8e91-5aba-9ae0-0b9f0a63cf06', 'data_vg': 'ceph-03e94b17-8e91-5aba-9ae0-0b9f0a63cf06'})  2025-09-27 00:42:09.637310 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-26537eb5-d37a-51fe-a7ad-0ae3582304de', 'data_vg': 'ceph-26537eb5-d37a-51fe-a7ad-0ae3582304de'})  2025-09-27 00:42:09.637321 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:42:09.637353 | orchestrator | 2025-09-27 00:42:09.637364 | orchestrator | TASK [Fail if number of OSDs exceeds num_osds for a DB VG] ********************* 2025-09-27 00:42:09.637375 | orchestrator | Saturday 27 September 2025 00:42:03 +0000 (0:00:00.156) 0:01:02.987 **** 2025-09-27 00:42:09.637386 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:42:09.637396 | orchestrator | 2025-09-27 00:42:09.637407 | orchestrator | TASK [Fail if number of OSDs exceeds num_osds for a WAL VG] ******************** 2025-09-27 00:42:09.637417 | orchestrator | Saturday 27 September 2025 00:42:04 +0000 (0:00:00.137) 0:01:03.124 **** 2025-09-27 00:42:09.637428 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:42:09.637439 | orchestrator | 2025-09-27 00:42:09.637449 | orchestrator | TASK [Fail if number of OSDs exceeds num_osds for a DB+WAL VG] ***************** 2025-09-27 00:42:09.637460 | orchestrator | Saturday 27 September 2025 00:42:04 +0000 (0:00:00.154) 0:01:03.278 **** 2025-09-27 00:42:09.637470 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:42:09.637481 | orchestrator | 2025-09-27 00:42:09.637491 | orchestrator | TASK [Print number of OSDs wanted per DB VG] *********************************** 2025-09-27 00:42:09.637517 | orchestrator | Saturday 27 September 2025 00:42:04 +0000 (0:00:00.140) 0:01:03.419 **** 2025-09-27 00:42:09.637529 | orchestrator | ok: [testbed-node-5] => { 2025-09-27 00:42:09.637540 | orchestrator |  "_num_osds_wanted_per_db_vg": {} 2025-09-27 00:42:09.637551 | orchestrator | } 2025-09-27 00:42:09.637562 | orchestrator | 2025-09-27 00:42:09.637572 | orchestrator | TASK [Print number of OSDs wanted per WAL VG] ********************************** 2025-09-27 00:42:09.637583 | orchestrator | Saturday 27 September 2025 00:42:04 +0000 (0:00:00.150) 
0:01:03.570 **** 2025-09-27 00:42:09.637593 | orchestrator | ok: [testbed-node-5] => { 2025-09-27 00:42:09.637604 | orchestrator |  "_num_osds_wanted_per_wal_vg": {} 2025-09-27 00:42:09.637615 | orchestrator | } 2025-09-27 00:42:09.637625 | orchestrator | 2025-09-27 00:42:09.637636 | orchestrator | TASK [Print number of OSDs wanted per DB+WAL VG] ******************************* 2025-09-27 00:42:09.637647 | orchestrator | Saturday 27 September 2025 00:42:04 +0000 (0:00:00.149) 0:01:03.720 **** 2025-09-27 00:42:09.637658 | orchestrator | ok: [testbed-node-5] => { 2025-09-27 00:42:09.637669 | orchestrator |  "_num_osds_wanted_per_db_wal_vg": {} 2025-09-27 00:42:09.637680 | orchestrator | } 2025-09-27 00:42:09.637690 | orchestrator | 2025-09-27 00:42:09.637701 | orchestrator | TASK [Gather DB VGs with total and available size in bytes] ******************** 2025-09-27 00:42:09.637712 | orchestrator | Saturday 27 September 2025 00:42:04 +0000 (0:00:00.136) 0:01:03.856 **** 2025-09-27 00:42:09.637722 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:42:09.637733 | orchestrator | 2025-09-27 00:42:09.637744 | orchestrator | TASK [Gather WAL VGs with total and available size in bytes] ******************* 2025-09-27 00:42:09.637754 | orchestrator | Saturday 27 September 2025 00:42:05 +0000 (0:00:00.529) 0:01:04.386 **** 2025-09-27 00:42:09.637765 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:42:09.637776 | orchestrator | 2025-09-27 00:42:09.637786 | orchestrator | TASK [Gather DB+WAL VGs with total and available size in bytes] **************** 2025-09-27 00:42:09.637797 | orchestrator | Saturday 27 September 2025 00:42:05 +0000 (0:00:00.558) 0:01:04.945 **** 2025-09-27 00:42:09.637807 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:42:09.637818 | orchestrator | 2025-09-27 00:42:09.637829 | orchestrator | TASK [Combine JSON from _db/wal/db_wal_vgs_cmd_output] ************************* 2025-09-27 00:42:09.637839 | orchestrator | Saturday 27 September 2025 00:42:06 +0000 (0:00:00.511) 0:01:05.457 **** 2025-09-27 00:42:09.637850 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:42:09.637860 | orchestrator | 2025-09-27 00:42:09.637871 | orchestrator | TASK [Calculate VG sizes (without buffer)] ************************************* 2025-09-27 00:42:09.637881 | orchestrator | Saturday 27 September 2025 00:42:06 +0000 (0:00:00.357) 0:01:05.814 **** 2025-09-27 00:42:09.637892 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:42:09.637903 | orchestrator | 2025-09-27 00:42:09.637913 | orchestrator | TASK [Calculate VG sizes (with buffer)] **************************************** 2025-09-27 00:42:09.637924 | orchestrator | Saturday 27 September 2025 00:42:06 +0000 (0:00:00.111) 0:01:05.926 **** 2025-09-27 00:42:09.637942 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:42:09.637953 | orchestrator | 2025-09-27 00:42:09.637963 | orchestrator | TASK [Print LVM VGs report data] *********************************************** 2025-09-27 00:42:09.637974 | orchestrator | Saturday 27 September 2025 00:42:06 +0000 (0:00:00.115) 0:01:06.041 **** 2025-09-27 00:42:09.637985 | orchestrator | ok: [testbed-node-5] => { 2025-09-27 00:42:09.637996 | orchestrator |  "vgs_report": { 2025-09-27 00:42:09.638007 | orchestrator |  "vg": [] 2025-09-27 00:42:09.638089 | orchestrator |  } 2025-09-27 00:42:09.638105 | orchestrator | } 2025-09-27 00:42:09.638116 | orchestrator | 2025-09-27 00:42:09.638127 | orchestrator | TASK [Print LVM VG sizes] ****************************************************** 
2025-09-27 00:42:09.638138 | orchestrator | Saturday 27 September 2025 00:42:07 +0000 (0:00:00.140) 0:01:06.182 **** 2025-09-27 00:42:09.638149 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:42:09.638160 | orchestrator | 2025-09-27 00:42:09.638171 | orchestrator | TASK [Calculate size needed for LVs on ceph_db_devices] ************************ 2025-09-27 00:42:09.638181 | orchestrator | Saturday 27 September 2025 00:42:07 +0000 (0:00:00.120) 0:01:06.302 **** 2025-09-27 00:42:09.638192 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:42:09.638222 | orchestrator | 2025-09-27 00:42:09.638234 | orchestrator | TASK [Print size needed for LVs on ceph_db_devices] **************************** 2025-09-27 00:42:09.638245 | orchestrator | Saturday 27 September 2025 00:42:07 +0000 (0:00:00.134) 0:01:06.437 **** 2025-09-27 00:42:09.638255 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:42:09.638266 | orchestrator | 2025-09-27 00:42:09.638277 | orchestrator | TASK [Fail if size of DB LVs on ceph_db_devices > available] ******************* 2025-09-27 00:42:09.638288 | orchestrator | Saturday 27 September 2025 00:42:07 +0000 (0:00:00.130) 0:01:06.568 **** 2025-09-27 00:42:09.638299 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:42:09.638309 | orchestrator | 2025-09-27 00:42:09.638320 | orchestrator | TASK [Calculate size needed for LVs on ceph_wal_devices] *********************** 2025-09-27 00:42:09.638331 | orchestrator | Saturday 27 September 2025 00:42:07 +0000 (0:00:00.130) 0:01:06.699 **** 2025-09-27 00:42:09.638341 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:42:09.638352 | orchestrator | 2025-09-27 00:42:09.638363 | orchestrator | TASK [Print size needed for LVs on ceph_wal_devices] *************************** 2025-09-27 00:42:09.638373 | orchestrator | Saturday 27 September 2025 00:42:07 +0000 (0:00:00.140) 0:01:06.840 **** 2025-09-27 00:42:09.638384 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:42:09.638395 | orchestrator | 2025-09-27 00:42:09.638406 | orchestrator | TASK [Fail if size of WAL LVs on ceph_wal_devices > available] ***************** 2025-09-27 00:42:09.638416 | orchestrator | Saturday 27 September 2025 00:42:07 +0000 (0:00:00.137) 0:01:06.977 **** 2025-09-27 00:42:09.638427 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:42:09.638438 | orchestrator | 2025-09-27 00:42:09.638449 | orchestrator | TASK [Calculate size needed for WAL LVs on ceph_db_wal_devices] **************** 2025-09-27 00:42:09.638459 | orchestrator | Saturday 27 September 2025 00:42:08 +0000 (0:00:00.131) 0:01:07.108 **** 2025-09-27 00:42:09.638470 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:42:09.638481 | orchestrator | 2025-09-27 00:42:09.638491 | orchestrator | TASK [Print size needed for WAL LVs on ceph_db_wal_devices] ******************** 2025-09-27 00:42:09.638502 | orchestrator | Saturday 27 September 2025 00:42:08 +0000 (0:00:00.126) 0:01:07.235 **** 2025-09-27 00:42:09.638513 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:42:09.638523 | orchestrator | 2025-09-27 00:42:09.638534 | orchestrator | TASK [Calculate size needed for DB LVs on ceph_db_wal_devices] ***************** 2025-09-27 00:42:09.638551 | orchestrator | Saturday 27 September 2025 00:42:08 +0000 (0:00:00.326) 0:01:07.561 **** 2025-09-27 00:42:09.638562 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:42:09.638573 | orchestrator | 2025-09-27 00:42:09.638584 | orchestrator | TASK [Print size needed for DB LVs on ceph_db_wal_devices] 
********************* 2025-09-27 00:42:09.638594 | orchestrator | Saturday 27 September 2025 00:42:08 +0000 (0:00:00.144) 0:01:07.706 **** 2025-09-27 00:42:09.638605 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:42:09.638623 | orchestrator | 2025-09-27 00:42:09.638634 | orchestrator | TASK [Fail if size of DB+WAL LVs on ceph_db_wal_devices > available] *********** 2025-09-27 00:42:09.638645 | orchestrator | Saturday 27 September 2025 00:42:08 +0000 (0:00:00.134) 0:01:07.840 **** 2025-09-27 00:42:09.638656 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:42:09.638667 | orchestrator | 2025-09-27 00:42:09.638677 | orchestrator | TASK [Fail if DB LV size < 30 GiB for ceph_db_devices] ************************* 2025-09-27 00:42:09.638688 | orchestrator | Saturday 27 September 2025 00:42:08 +0000 (0:00:00.135) 0:01:07.976 **** 2025-09-27 00:42:09.638699 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:42:09.638709 | orchestrator | 2025-09-27 00:42:09.638720 | orchestrator | TASK [Fail if DB LV size < 30 GiB for ceph_db_wal_devices] ********************* 2025-09-27 00:42:09.638731 | orchestrator | Saturday 27 September 2025 00:42:09 +0000 (0:00:00.139) 0:01:08.115 **** 2025-09-27 00:42:09.638741 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:42:09.638752 | orchestrator | 2025-09-27 00:42:09.638762 | orchestrator | TASK [Create DB LVs for ceph_db_devices] *************************************** 2025-09-27 00:42:09.638773 | orchestrator | Saturday 27 September 2025 00:42:09 +0000 (0:00:00.128) 0:01:08.244 **** 2025-09-27 00:42:09.638784 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-03e94b17-8e91-5aba-9ae0-0b9f0a63cf06', 'data_vg': 'ceph-03e94b17-8e91-5aba-9ae0-0b9f0a63cf06'})  2025-09-27 00:42:09.638795 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-26537eb5-d37a-51fe-a7ad-0ae3582304de', 'data_vg': 'ceph-26537eb5-d37a-51fe-a7ad-0ae3582304de'})  2025-09-27 00:42:09.638806 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:42:09.638817 | orchestrator | 2025-09-27 00:42:09.638827 | orchestrator | TASK [Print 'Create DB LVs for ceph_db_devices'] ******************************* 2025-09-27 00:42:09.638838 | orchestrator | Saturday 27 September 2025 00:42:09 +0000 (0:00:00.164) 0:01:08.408 **** 2025-09-27 00:42:09.638849 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-03e94b17-8e91-5aba-9ae0-0b9f0a63cf06', 'data_vg': 'ceph-03e94b17-8e91-5aba-9ae0-0b9f0a63cf06'})  2025-09-27 00:42:09.638860 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-26537eb5-d37a-51fe-a7ad-0ae3582304de', 'data_vg': 'ceph-26537eb5-d37a-51fe-a7ad-0ae3582304de'})  2025-09-27 00:42:09.638870 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:42:09.638881 | orchestrator | 2025-09-27 00:42:09.638892 | orchestrator | TASK [Create WAL LVs for ceph_wal_devices] ************************************* 2025-09-27 00:42:09.638903 | orchestrator | Saturday 27 September 2025 00:42:09 +0000 (0:00:00.152) 0:01:08.560 **** 2025-09-27 00:42:09.638922 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-03e94b17-8e91-5aba-9ae0-0b9f0a63cf06', 'data_vg': 'ceph-03e94b17-8e91-5aba-9ae0-0b9f0a63cf06'})  2025-09-27 00:42:12.629514 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-26537eb5-d37a-51fe-a7ad-0ae3582304de', 'data_vg': 'ceph-26537eb5-d37a-51fe-a7ad-0ae3582304de'})  2025-09-27 00:42:12.629637 | orchestrator | skipping: [testbed-node-5] 2025-09-27 
00:42:12.629660 | orchestrator | 2025-09-27 00:42:12.629673 | orchestrator | TASK [Print 'Create WAL LVs for ceph_wal_devices'] ***************************** 2025-09-27 00:42:12.629686 | orchestrator | Saturday 27 September 2025 00:42:09 +0000 (0:00:00.157) 0:01:08.717 **** 2025-09-27 00:42:12.629698 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-03e94b17-8e91-5aba-9ae0-0b9f0a63cf06', 'data_vg': 'ceph-03e94b17-8e91-5aba-9ae0-0b9f0a63cf06'})  2025-09-27 00:42:12.629709 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-26537eb5-d37a-51fe-a7ad-0ae3582304de', 'data_vg': 'ceph-26537eb5-d37a-51fe-a7ad-0ae3582304de'})  2025-09-27 00:42:12.629720 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:42:12.629731 | orchestrator | 2025-09-27 00:42:12.629742 | orchestrator | TASK [Create WAL LVs for ceph_db_wal_devices] ********************************** 2025-09-27 00:42:12.629753 | orchestrator | Saturday 27 September 2025 00:42:09 +0000 (0:00:00.152) 0:01:08.870 **** 2025-09-27 00:42:12.629764 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-03e94b17-8e91-5aba-9ae0-0b9f0a63cf06', 'data_vg': 'ceph-03e94b17-8e91-5aba-9ae0-0b9f0a63cf06'})  2025-09-27 00:42:12.629799 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-26537eb5-d37a-51fe-a7ad-0ae3582304de', 'data_vg': 'ceph-26537eb5-d37a-51fe-a7ad-0ae3582304de'})  2025-09-27 00:42:12.629810 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:42:12.629821 | orchestrator | 2025-09-27 00:42:12.629832 | orchestrator | TASK [Print 'Create WAL LVs for ceph_db_wal_devices'] ************************** 2025-09-27 00:42:12.629843 | orchestrator | Saturday 27 September 2025 00:42:09 +0000 (0:00:00.143) 0:01:09.013 **** 2025-09-27 00:42:12.629853 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-03e94b17-8e91-5aba-9ae0-0b9f0a63cf06', 'data_vg': 'ceph-03e94b17-8e91-5aba-9ae0-0b9f0a63cf06'})  2025-09-27 00:42:12.629864 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-26537eb5-d37a-51fe-a7ad-0ae3582304de', 'data_vg': 'ceph-26537eb5-d37a-51fe-a7ad-0ae3582304de'})  2025-09-27 00:42:12.629879 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:42:12.629898 | orchestrator | 2025-09-27 00:42:12.629917 | orchestrator | TASK [Create DB LVs for ceph_db_wal_devices] *********************************** 2025-09-27 00:42:12.629944 | orchestrator | Saturday 27 September 2025 00:42:10 +0000 (0:00:00.147) 0:01:09.160 **** 2025-09-27 00:42:12.629968 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-03e94b17-8e91-5aba-9ae0-0b9f0a63cf06', 'data_vg': 'ceph-03e94b17-8e91-5aba-9ae0-0b9f0a63cf06'})  2025-09-27 00:42:12.629988 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-26537eb5-d37a-51fe-a7ad-0ae3582304de', 'data_vg': 'ceph-26537eb5-d37a-51fe-a7ad-0ae3582304de'})  2025-09-27 00:42:12.630000 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:42:12.630011 | orchestrator | 2025-09-27 00:42:12.630085 | orchestrator | TASK [Print 'Create DB LVs for ceph_db_wal_devices'] *************************** 2025-09-27 00:42:12.630098 | orchestrator | Saturday 27 September 2025 00:42:10 +0000 (0:00:00.329) 0:01:09.490 **** 2025-09-27 00:42:12.630147 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-03e94b17-8e91-5aba-9ae0-0b9f0a63cf06', 'data_vg': 'ceph-03e94b17-8e91-5aba-9ae0-0b9f0a63cf06'})  2025-09-27 00:42:12.630160 | orchestrator | skipping: [testbed-node-5] => 
(item={'data': 'osd-block-26537eb5-d37a-51fe-a7ad-0ae3582304de', 'data_vg': 'ceph-26537eb5-d37a-51fe-a7ad-0ae3582304de'})  2025-09-27 00:42:12.630172 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:42:12.630185 | orchestrator | 2025-09-27 00:42:12.630196 | orchestrator | TASK [Get list of Ceph LVs with associated VGs] ******************************** 2025-09-27 00:42:12.630236 | orchestrator | Saturday 27 September 2025 00:42:10 +0000 (0:00:00.158) 0:01:09.649 **** 2025-09-27 00:42:12.630247 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:42:12.630259 | orchestrator | 2025-09-27 00:42:12.630270 | orchestrator | TASK [Get list of Ceph PVs with associated VGs] ******************************** 2025-09-27 00:42:12.630281 | orchestrator | Saturday 27 September 2025 00:42:11 +0000 (0:00:00.526) 0:01:10.176 **** 2025-09-27 00:42:12.630292 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:42:12.630302 | orchestrator | 2025-09-27 00:42:12.630313 | orchestrator | TASK [Combine JSON from _lvs_cmd_output/_pvs_cmd_output] *********************** 2025-09-27 00:42:12.630324 | orchestrator | Saturday 27 September 2025 00:42:11 +0000 (0:00:00.565) 0:01:10.741 **** 2025-09-27 00:42:12.630335 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:42:12.630345 | orchestrator | 2025-09-27 00:42:12.630356 | orchestrator | TASK [Create list of VG/LV names] ********************************************** 2025-09-27 00:42:12.630367 | orchestrator | Saturday 27 September 2025 00:42:11 +0000 (0:00:00.148) 0:01:10.889 **** 2025-09-27 00:42:12.630377 | orchestrator | ok: [testbed-node-5] => (item={'lv_name': 'osd-block-03e94b17-8e91-5aba-9ae0-0b9f0a63cf06', 'vg_name': 'ceph-03e94b17-8e91-5aba-9ae0-0b9f0a63cf06'}) 2025-09-27 00:42:12.630390 | orchestrator | ok: [testbed-node-5] => (item={'lv_name': 'osd-block-26537eb5-d37a-51fe-a7ad-0ae3582304de', 'vg_name': 'ceph-26537eb5-d37a-51fe-a7ad-0ae3582304de'}) 2025-09-27 00:42:12.630401 | orchestrator | 2025-09-27 00:42:12.630411 | orchestrator | TASK [Fail if block LV defined in lvm_volumes is missing] ********************** 2025-09-27 00:42:12.630441 | orchestrator | Saturday 27 September 2025 00:42:11 +0000 (0:00:00.174) 0:01:11.064 **** 2025-09-27 00:42:12.630483 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-03e94b17-8e91-5aba-9ae0-0b9f0a63cf06', 'data_vg': 'ceph-03e94b17-8e91-5aba-9ae0-0b9f0a63cf06'})  2025-09-27 00:42:12.630502 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-26537eb5-d37a-51fe-a7ad-0ae3582304de', 'data_vg': 'ceph-26537eb5-d37a-51fe-a7ad-0ae3582304de'})  2025-09-27 00:42:12.630521 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:42:12.630539 | orchestrator | 2025-09-27 00:42:12.630557 | orchestrator | TASK [Fail if DB LV defined in lvm_volumes is missing] ************************* 2025-09-27 00:42:12.630574 | orchestrator | Saturday 27 September 2025 00:42:12 +0000 (0:00:00.166) 0:01:11.230 **** 2025-09-27 00:42:12.630586 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-03e94b17-8e91-5aba-9ae0-0b9f0a63cf06', 'data_vg': 'ceph-03e94b17-8e91-5aba-9ae0-0b9f0a63cf06'})  2025-09-27 00:42:12.630596 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-26537eb5-d37a-51fe-a7ad-0ae3582304de', 'data_vg': 'ceph-26537eb5-d37a-51fe-a7ad-0ae3582304de'})  2025-09-27 00:42:12.630608 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:42:12.630619 | orchestrator | 2025-09-27 00:42:12.630629 | orchestrator | TASK [Fail if WAL LV defined in lvm_volumes 
is missing] ************************ 2025-09-27 00:42:12.630640 | orchestrator | Saturday 27 September 2025 00:42:12 +0000 (0:00:00.155) 0:01:11.385 **** 2025-09-27 00:42:12.630653 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-03e94b17-8e91-5aba-9ae0-0b9f0a63cf06', 'data_vg': 'ceph-03e94b17-8e91-5aba-9ae0-0b9f0a63cf06'})  2025-09-27 00:42:12.630692 | orchestrator | skipping: [testbed-node-5] => (item={'data': 'osd-block-26537eb5-d37a-51fe-a7ad-0ae3582304de', 'data_vg': 'ceph-26537eb5-d37a-51fe-a7ad-0ae3582304de'})  2025-09-27 00:42:12.630722 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:42:12.630743 | orchestrator | 2025-09-27 00:42:12.630762 | orchestrator | TASK [Print LVM report data] *************************************************** 2025-09-27 00:42:12.630780 | orchestrator | Saturday 27 September 2025 00:42:12 +0000 (0:00:00.158) 0:01:11.544 **** 2025-09-27 00:42:12.630792 | orchestrator | ok: [testbed-node-5] => { 2025-09-27 00:42:12.630803 | orchestrator |  "lvm_report": { 2025-09-27 00:42:12.630814 | orchestrator |  "lv": [ 2025-09-27 00:42:12.630825 | orchestrator |  { 2025-09-27 00:42:12.630836 | orchestrator |  "lv_name": "osd-block-03e94b17-8e91-5aba-9ae0-0b9f0a63cf06", 2025-09-27 00:42:12.630855 | orchestrator |  "vg_name": "ceph-03e94b17-8e91-5aba-9ae0-0b9f0a63cf06" 2025-09-27 00:42:12.630866 | orchestrator |  }, 2025-09-27 00:42:12.630876 | orchestrator |  { 2025-09-27 00:42:12.630887 | orchestrator |  "lv_name": "osd-block-26537eb5-d37a-51fe-a7ad-0ae3582304de", 2025-09-27 00:42:12.630898 | orchestrator |  "vg_name": "ceph-26537eb5-d37a-51fe-a7ad-0ae3582304de" 2025-09-27 00:42:12.630908 | orchestrator |  } 2025-09-27 00:42:12.630919 | orchestrator |  ], 2025-09-27 00:42:12.630930 | orchestrator |  "pv": [ 2025-09-27 00:42:12.630940 | orchestrator |  { 2025-09-27 00:42:12.630951 | orchestrator |  "pv_name": "/dev/sdb", 2025-09-27 00:42:12.630961 | orchestrator |  "vg_name": "ceph-03e94b17-8e91-5aba-9ae0-0b9f0a63cf06" 2025-09-27 00:42:12.630972 | orchestrator |  }, 2025-09-27 00:42:12.630982 | orchestrator |  { 2025-09-27 00:42:12.630993 | orchestrator |  "pv_name": "/dev/sdc", 2025-09-27 00:42:12.631011 | orchestrator |  "vg_name": "ceph-26537eb5-d37a-51fe-a7ad-0ae3582304de" 2025-09-27 00:42:12.631029 | orchestrator |  } 2025-09-27 00:42:12.631047 | orchestrator |  ] 2025-09-27 00:42:12.631074 | orchestrator |  } 2025-09-27 00:42:12.631097 | orchestrator | } 2025-09-27 00:42:12.631116 | orchestrator | 2025-09-27 00:42:12.631135 | orchestrator | PLAY RECAP ********************************************************************* 2025-09-27 00:42:12.631170 | orchestrator | testbed-node-3 : ok=51  changed=2  unreachable=0 failed=0 skipped=62  rescued=0 ignored=0 2025-09-27 00:42:12.631189 | orchestrator | testbed-node-4 : ok=51  changed=2  unreachable=0 failed=0 skipped=62  rescued=0 ignored=0 2025-09-27 00:42:12.631231 | orchestrator | testbed-node-5 : ok=51  changed=2  unreachable=0 failed=0 skipped=62  rescued=0 ignored=0 2025-09-27 00:42:12.631243 | orchestrator | 2025-09-27 00:42:12.631254 | orchestrator | 2025-09-27 00:42:12.631264 | orchestrator | 2025-09-27 00:42:12.631275 | orchestrator | TASKS RECAP ******************************************************************** 2025-09-27 00:42:12.631286 | orchestrator | Saturday 27 September 2025 00:42:12 +0000 (0:00:00.140) 0:01:11.685 **** 2025-09-27 00:42:12.631296 | orchestrator | =============================================================================== 2025-09-27 00:42:12.631307 | 
orchestrator | Create block VGs -------------------------------------------------------- 5.76s 2025-09-27 00:42:12.631318 | orchestrator | Create block LVs -------------------------------------------------------- 4.16s 2025-09-27 00:42:12.631329 | orchestrator | Gather DB VGs with total and available size in bytes -------------------- 1.82s 2025-09-27 00:42:12.631339 | orchestrator | Get list of Ceph PVs with associated VGs -------------------------------- 1.65s 2025-09-27 00:42:12.631350 | orchestrator | Get list of Ceph LVs with associated VGs -------------------------------- 1.65s 2025-09-27 00:42:12.631360 | orchestrator | Gather WAL VGs with total and available size in bytes ------------------- 1.63s 2025-09-27 00:42:12.631371 | orchestrator | Gather DB+WAL VGs with total and available size in bytes ---------------- 1.55s 2025-09-27 00:42:12.631382 | orchestrator | Add known partitions to the list of available block devices ------------- 1.54s 2025-09-27 00:42:12.631405 | orchestrator | Add known links to the list of available block devices ------------------ 1.15s 2025-09-27 00:42:12.976031 | orchestrator | Add known partitions to the list of available block devices ------------- 1.15s 2025-09-27 00:42:12.976133 | orchestrator | Print LVM report data --------------------------------------------------- 0.93s 2025-09-27 00:42:12.976148 | orchestrator | Add known partitions to the list of available block devices ------------- 0.86s 2025-09-27 00:42:12.976160 | orchestrator | Add known links to the list of available block devices ------------------ 0.78s 2025-09-27 00:42:12.976172 | orchestrator | Fail if DB LV defined in lvm_volumes is missing ------------------------- 0.77s 2025-09-27 00:42:12.976183 | orchestrator | Add known partitions to the list of available block devices ------------- 0.77s 2025-09-27 00:42:12.976195 | orchestrator | Create DB LVs for ceph_db_wal_devices ----------------------------------- 0.72s 2025-09-27 00:42:12.976261 | orchestrator | Get extra vars for Ceph configuration ----------------------------------- 0.71s 2025-09-27 00:42:12.976273 | orchestrator | Add known partitions to the list of available block devices ------------- 0.68s 2025-09-27 00:42:12.976284 | orchestrator | Add known partitions to the list of available block devices ------------- 0.67s 2025-09-27 00:42:12.976295 | orchestrator | Combine JSON from _db/wal/db_wal_vgs_cmd_output ------------------------- 0.66s 2025-09-27 00:42:25.216728 | orchestrator | 2025-09-27 00:42:25 | INFO  | Task 88ae194e-9fdb-4c6b-92d1-8c1a38c20e34 (facts) was prepared for execution. 2025-09-27 00:42:25.217447 | orchestrator | 2025-09-27 00:42:25 | INFO  | It takes a moment until task 88ae194e-9fdb-4c6b-92d1-8c1a38c20e34 (facts) has been started and output is visible here. 
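The Ceph LVM play summarised in the recap above boils down to: create one volume group per entry in ceph_osd_devices, create a single "osd-block-<uuid>" logical volume spanning it, and then read the result back with lvs/pvs in JSON format for the lvm_report. The following is a minimal, hypothetical sketch of that flow using the stock community.general LVM modules; the module choices, the _block_vg_pvs lookup name and the 100%VG sizing are assumptions for illustration and are not taken from the actual OSISM role.

---
# Hypothetical sketch of the "Create block VGs" / "Create block LVs" flow.
# lvm_volumes entries look like {'data': 'osd-block-<uuid>', 'data_vg': 'ceph-<uuid>'},
# matching the loop items printed in the log above.
- name: Create block VGs (one VG per OSD data device)
  community.general.lvg:
    vg: "{{ item.data_vg }}"
    pvs: "{{ _block_vg_pvs[item.data_vg] }}"   # assumed dict VG -> PV, e.g. /dev/sdb
    state: present
  loop: "{{ lvm_volumes }}"

- name: Create block LVs (one LV spanning each VG)
  community.general.lvol:
    vg: "{{ item.data_vg }}"
    lv: "{{ item.data }}"
    size: 100%VG                               # assumed sizing
    state: present
  loop: "{{ lvm_volumes }}"

- name: Get list of Ceph LVs with associated VGs
  ansible.builtin.command: lvs --reportformat json -o lv_name,vg_name
  register: _lvs_cmd_output
  changed_when: false

- name: Print LVM report data
  ansible.builtin.debug:
    msg: "{{ (_lvs_cmd_output.stdout | from_json).report[0].lv }}"

Run against testbed-node-5, a sketch like this would produce the same lv_name/vg_name pairs that appear in the lvm_report output earlier in the log.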
2025-09-27 00:42:37.329676 | orchestrator | 2025-09-27 00:42:37.329792 | orchestrator | PLAY [Apply role facts] ******************************************************** 2025-09-27 00:42:37.329808 | orchestrator | 2025-09-27 00:42:37.329820 | orchestrator | TASK [osism.commons.facts : Create custom facts directory] ********************* 2025-09-27 00:42:37.329832 | orchestrator | Saturday 27 September 2025 00:42:29 +0000 (0:00:00.267) 0:00:00.267 **** 2025-09-27 00:42:37.329843 | orchestrator | ok: [testbed-manager] 2025-09-27 00:42:37.329855 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:42:37.329893 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:42:37.329905 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:42:37.329915 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:42:37.329926 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:42:37.329936 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:42:37.329947 | orchestrator | 2025-09-27 00:42:37.329957 | orchestrator | TASK [osism.commons.facts : Copy fact files] *********************************** 2025-09-27 00:42:37.329968 | orchestrator | Saturday 27 September 2025 00:42:30 +0000 (0:00:01.136) 0:00:01.404 **** 2025-09-27 00:42:37.329994 | orchestrator | skipping: [testbed-manager] 2025-09-27 00:42:37.330006 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:42:37.330075 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:42:37.330088 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:42:37.330099 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:42:37.330109 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:42:37.330120 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:42:37.330130 | orchestrator | 2025-09-27 00:42:37.330141 | orchestrator | PLAY [Gather facts for all hosts] ********************************************** 2025-09-27 00:42:37.330152 | orchestrator | 2025-09-27 00:42:37.330162 | orchestrator | TASK [Gathers facts about hosts] *********************************************** 2025-09-27 00:42:37.330173 | orchestrator | Saturday 27 September 2025 00:42:31 +0000 (0:00:01.192) 0:00:02.596 **** 2025-09-27 00:42:37.330184 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:42:37.330194 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:42:37.330242 | orchestrator | ok: [testbed-manager] 2025-09-27 00:42:37.330255 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:42:37.330267 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:42:37.330279 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:42:37.330291 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:42:37.330303 | orchestrator | 2025-09-27 00:42:37.330315 | orchestrator | PLAY [Gather facts for all hosts if using --limit] ***************************** 2025-09-27 00:42:37.330327 | orchestrator | 2025-09-27 00:42:37.330339 | orchestrator | TASK [Gather facts for all hosts] ********************************************** 2025-09-27 00:42:37.330352 | orchestrator | Saturday 27 September 2025 00:42:36 +0000 (0:00:04.947) 0:00:07.543 **** 2025-09-27 00:42:37.330364 | orchestrator | skipping: [testbed-manager] 2025-09-27 00:42:37.330375 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:42:37.330387 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:42:37.330399 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:42:37.330412 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:42:37.330424 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:42:37.330435 | orchestrator | skipping: 
[testbed-node-5] 2025-09-27 00:42:37.330446 | orchestrator | 2025-09-27 00:42:37.330459 | orchestrator | PLAY RECAP ********************************************************************* 2025-09-27 00:42:37.330471 | orchestrator | testbed-manager : ok=2  changed=0 unreachable=0 failed=0 skipped=2  rescued=0 ignored=0 2025-09-27 00:42:37.330485 | orchestrator | testbed-node-0 : ok=2  changed=0 unreachable=0 failed=0 skipped=2  rescued=0 ignored=0 2025-09-27 00:42:37.330497 | orchestrator | testbed-node-1 : ok=2  changed=0 unreachable=0 failed=0 skipped=2  rescued=0 ignored=0 2025-09-27 00:42:37.330509 | orchestrator | testbed-node-2 : ok=2  changed=0 unreachable=0 failed=0 skipped=2  rescued=0 ignored=0 2025-09-27 00:42:37.330521 | orchestrator | testbed-node-3 : ok=2  changed=0 unreachable=0 failed=0 skipped=2  rescued=0 ignored=0 2025-09-27 00:42:37.330534 | orchestrator | testbed-node-4 : ok=2  changed=0 unreachable=0 failed=0 skipped=2  rescued=0 ignored=0 2025-09-27 00:42:37.330546 | orchestrator | testbed-node-5 : ok=2  changed=0 unreachable=0 failed=0 skipped=2  rescued=0 ignored=0 2025-09-27 00:42:37.330568 | orchestrator | 2025-09-27 00:42:37.330580 | orchestrator | 2025-09-27 00:42:37.330590 | orchestrator | TASKS RECAP ******************************************************************** 2025-09-27 00:42:37.330601 | orchestrator | Saturday 27 September 2025 00:42:36 +0000 (0:00:00.530) 0:00:08.074 **** 2025-09-27 00:42:37.330612 | orchestrator | =============================================================================== 2025-09-27 00:42:37.330623 | orchestrator | Gathers facts about hosts ----------------------------------------------- 4.95s 2025-09-27 00:42:37.330633 | orchestrator | osism.commons.facts : Copy fact files ----------------------------------- 1.19s 2025-09-27 00:42:37.330644 | orchestrator | osism.commons.facts : Create custom facts directory --------------------- 1.14s 2025-09-27 00:42:37.330655 | orchestrator | Gather facts for all hosts ---------------------------------------------- 0.53s 2025-09-27 00:42:49.599134 | orchestrator | 2025-09-27 00:42:49 | INFO  | Task ec506330-30bb-4921-81f2-d2a087934088 (frr) was prepared for execution. 2025-09-27 00:42:49.599312 | orchestrator | 2025-09-27 00:42:49 | INFO  | It takes a moment until task ec506330-30bb-4921-81f2-d2a087934088 (frr) has been started and output is visible here. 
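The facts task above is a short bootstrap: ensure the custom facts directory exists on every host, optionally copy fact files into it, and then re-gather facts so the new local facts are visible to subsequent roles. A minimal sketch, assuming the default /etc/ansible/facts.d location and a hypothetical facts_files list (the real osism.commons.facts role may use different paths and variables):

---
# Hypothetical sketch of the "Apply role facts" / "Gather facts for all hosts" plays.
- name: Apply role facts
  hosts: all
  gather_facts: false
  become: true
  tasks:
    - name: Create custom facts directory
      ansible.builtin.file:
        path: /etc/ansible/facts.d             # assumed default location
        state: directory
        mode: "0755"

    - name: Copy fact files
      ansible.builtin.copy:
        src: "{{ item }}"
        dest: /etc/ansible/facts.d/
        mode: "0644"
      loop: "{{ facts_files | default([]) }}"   # hypothetical variable name

- name: Gather facts for all hosts
  hosts: all
  gather_facts: true                            # equivalent to running ansible.builtin.setup
  tasks: []

Anything placed in facts.d as a *.fact file is exposed under ansible_local on the next fact gathering run, which is why the deployment refreshes facts here before the service roles such as frr are applied.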
2025-09-27 00:43:15.131079 | orchestrator | 2025-09-27 00:43:15.131196 | orchestrator | PLAY [Apply role frr] ********************************************************** 2025-09-27 00:43:15.131266 | orchestrator | 2025-09-27 00:43:15.131279 | orchestrator | TASK [osism.services.frr : Include distribution specific install tasks] ******** 2025-09-27 00:43:15.131291 | orchestrator | Saturday 27 September 2025 00:42:53 +0000 (0:00:00.233) 0:00:00.233 **** 2025-09-27 00:43:15.131302 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/frr/tasks/install-Debian-family.yml for testbed-manager 2025-09-27 00:43:15.131315 | orchestrator | 2025-09-27 00:43:15.131326 | orchestrator | TASK [osism.services.frr : Pin frr package version] **************************** 2025-09-27 00:43:15.131337 | orchestrator | Saturday 27 September 2025 00:42:53 +0000 (0:00:00.223) 0:00:00.456 **** 2025-09-27 00:43:15.131348 | orchestrator | changed: [testbed-manager] 2025-09-27 00:43:15.131360 | orchestrator | 2025-09-27 00:43:15.131371 | orchestrator | TASK [osism.services.frr : Install frr package] ******************************** 2025-09-27 00:43:15.131382 | orchestrator | Saturday 27 September 2025 00:42:55 +0000 (0:00:01.165) 0:00:01.621 **** 2025-09-27 00:43:15.131392 | orchestrator | changed: [testbed-manager] 2025-09-27 00:43:15.131403 | orchestrator | 2025-09-27 00:43:15.131432 | orchestrator | TASK [osism.services.frr : Copy file: /etc/frr/vtysh.conf] ********************* 2025-09-27 00:43:15.131443 | orchestrator | Saturday 27 September 2025 00:43:04 +0000 (0:00:09.669) 0:00:11.291 **** 2025-09-27 00:43:15.131454 | orchestrator | ok: [testbed-manager] 2025-09-27 00:43:15.131466 | orchestrator | 2025-09-27 00:43:15.131476 | orchestrator | TASK [osism.services.frr : Copy file: /etc/frr/daemons] ************************ 2025-09-27 00:43:15.131487 | orchestrator | Saturday 27 September 2025 00:43:05 +0000 (0:00:01.267) 0:00:12.559 **** 2025-09-27 00:43:15.131498 | orchestrator | changed: [testbed-manager] 2025-09-27 00:43:15.131509 | orchestrator | 2025-09-27 00:43:15.131519 | orchestrator | TASK [osism.services.frr : Set _frr_uplinks fact] ****************************** 2025-09-27 00:43:15.131530 | orchestrator | Saturday 27 September 2025 00:43:06 +0000 (0:00:00.950) 0:00:13.510 **** 2025-09-27 00:43:15.131541 | orchestrator | ok: [testbed-manager] 2025-09-27 00:43:15.131552 | orchestrator | 2025-09-27 00:43:15.131562 | orchestrator | TASK [osism.services.frr : Check for frr.conf file in the configuration repository] *** 2025-09-27 00:43:15.131574 | orchestrator | Saturday 27 September 2025 00:43:08 +0000 (0:00:01.169) 0:00:14.680 **** 2025-09-27 00:43:15.131585 | orchestrator | ok: [testbed-manager -> localhost] 2025-09-27 00:43:15.131595 | orchestrator | 2025-09-27 00:43:15.131607 | orchestrator | TASK [osism.services.frr : Copy file from the configuration repository: /etc/frr/frr.conf] *** 2025-09-27 00:43:15.131620 | orchestrator | Saturday 27 September 2025 00:43:08 +0000 (0:00:00.785) 0:00:15.465 **** 2025-09-27 00:43:15.131632 | orchestrator | skipping: [testbed-manager] 2025-09-27 00:43:15.131644 | orchestrator | 2025-09-27 00:43:15.131657 | orchestrator | TASK [osism.services.frr : Copy file from the role: /etc/frr/frr.conf] ********* 2025-09-27 00:43:15.131689 | orchestrator | Saturday 27 September 2025 00:43:09 +0000 (0:00:00.169) 0:00:15.635 **** 2025-09-27 00:43:15.131701 | orchestrator | changed: [testbed-manager] 2025-09-27 00:43:15.131714 | orchestrator 
| 2025-09-27 00:43:15.131726 | orchestrator | TASK [osism.services.frr : Set sysctl parameters] ****************************** 2025-09-27 00:43:15.131740 | orchestrator | Saturday 27 September 2025 00:43:10 +0000 (0:00:00.934) 0:00:16.569 **** 2025-09-27 00:43:15.131752 | orchestrator | changed: [testbed-manager] => (item={'name': 'net.ipv4.ip_forward', 'value': 1}) 2025-09-27 00:43:15.131765 | orchestrator | changed: [testbed-manager] => (item={'name': 'net.ipv4.conf.all.send_redirects', 'value': 0}) 2025-09-27 00:43:15.131778 | orchestrator | changed: [testbed-manager] => (item={'name': 'net.ipv4.conf.all.accept_redirects', 'value': 0}) 2025-09-27 00:43:15.131791 | orchestrator | changed: [testbed-manager] => (item={'name': 'net.ipv4.fib_multipath_hash_policy', 'value': 1}) 2025-09-27 00:43:15.131803 | orchestrator | changed: [testbed-manager] => (item={'name': 'net.ipv4.conf.default.ignore_routes_with_linkdown', 'value': 1}) 2025-09-27 00:43:15.131816 | orchestrator | changed: [testbed-manager] => (item={'name': 'net.ipv4.conf.all.rp_filter', 'value': 2}) 2025-09-27 00:43:15.131828 | orchestrator | 2025-09-27 00:43:15.131840 | orchestrator | TASK [osism.services.frr : Manage frr service] ********************************* 2025-09-27 00:43:15.131853 | orchestrator | Saturday 27 September 2025 00:43:12 +0000 (0:00:02.170) 0:00:18.740 **** 2025-09-27 00:43:15.131865 | orchestrator | ok: [testbed-manager] 2025-09-27 00:43:15.131877 | orchestrator | 2025-09-27 00:43:15.131890 | orchestrator | RUNNING HANDLER [osism.services.frr : Restart frr service] ********************* 2025-09-27 00:43:15.131902 | orchestrator | Saturday 27 September 2025 00:43:13 +0000 (0:00:01.321) 0:00:20.062 **** 2025-09-27 00:43:15.131914 | orchestrator | changed: [testbed-manager] 2025-09-27 00:43:15.131927 | orchestrator | 2025-09-27 00:43:15.131940 | orchestrator | PLAY RECAP ********************************************************************* 2025-09-27 00:43:15.131952 | orchestrator | testbed-manager : ok=11  changed=6  unreachable=0 failed=0 skipped=1  rescued=0 ignored=0 2025-09-27 00:43:15.131965 | orchestrator | 2025-09-27 00:43:15.131975 | orchestrator | 2025-09-27 00:43:15.131986 | orchestrator | TASKS RECAP ******************************************************************** 2025-09-27 00:43:15.131996 | orchestrator | Saturday 27 September 2025 00:43:14 +0000 (0:00:01.381) 0:00:21.444 **** 2025-09-27 00:43:15.132007 | orchestrator | =============================================================================== 2025-09-27 00:43:15.132018 | orchestrator | osism.services.frr : Install frr package -------------------------------- 9.67s 2025-09-27 00:43:15.132029 | orchestrator | osism.services.frr : Set sysctl parameters ------------------------------ 2.17s 2025-09-27 00:43:15.132039 | orchestrator | osism.services.frr : Restart frr service -------------------------------- 1.38s 2025-09-27 00:43:15.132050 | orchestrator | osism.services.frr : Manage frr service --------------------------------- 1.32s 2025-09-27 00:43:15.132078 | orchestrator | osism.services.frr : Copy file: /etc/frr/vtysh.conf --------------------- 1.27s 2025-09-27 00:43:15.132090 | orchestrator | osism.services.frr : Set _frr_uplinks fact ------------------------------ 1.17s 2025-09-27 00:43:15.132101 | orchestrator | osism.services.frr : Pin frr package version ---------------------------- 1.17s 2025-09-27 00:43:15.132111 | orchestrator | osism.services.frr : Copy file: /etc/frr/daemons ------------------------ 0.95s 2025-09-27 
00:43:15.132122 | orchestrator | osism.services.frr : Copy file from the role: /etc/frr/frr.conf --------- 0.93s 2025-09-27 00:43:15.132133 | orchestrator | osism.services.frr : Check for frr.conf file in the configuration repository --- 0.79s 2025-09-27 00:43:15.132143 | orchestrator | osism.services.frr : Include distribution specific install tasks -------- 0.22s 2025-09-27 00:43:15.132154 | orchestrator | osism.services.frr : Copy file from the configuration repository: /etc/frr/frr.conf --- 0.17s 2025-09-27 00:43:15.421662 | orchestrator | 2025-09-27 00:43:15.425086 | orchestrator | --> DEPLOY IN A NUTSHELL -- START -- Sat Sep 27 00:43:15 UTC 2025 2025-09-27 00:43:15.425159 | orchestrator | 2025-09-27 00:43:17.279841 | orchestrator | 2025-09-27 00:43:17 | INFO  | Collection nutshell is prepared for execution 2025-09-27 00:43:17.279943 | orchestrator | 2025-09-27 00:43:17 | INFO  | D [0] - dotfiles 2025-09-27 00:43:27.451755 | orchestrator | 2025-09-27 00:43:27 | INFO  | D [0] - homer 2025-09-27 00:43:27.451849 | orchestrator | 2025-09-27 00:43:27 | INFO  | D [0] - netdata 2025-09-27 00:43:27.451860 | orchestrator | 2025-09-27 00:43:27 | INFO  | D [0] - openstackclient 2025-09-27 00:43:27.451869 | orchestrator | 2025-09-27 00:43:27 | INFO  | D [0] - phpmyadmin 2025-09-27 00:43:27.451876 | orchestrator | 2025-09-27 00:43:27 | INFO  | A [0] - common 2025-09-27 00:43:27.455589 | orchestrator | 2025-09-27 00:43:27 | INFO  | A [1] -- loadbalancer 2025-09-27 00:43:27.455740 | orchestrator | 2025-09-27 00:43:27 | INFO  | D [2] --- opensearch 2025-09-27 00:43:27.455756 | orchestrator | 2025-09-27 00:43:27 | INFO  | A [2] --- mariadb-ng 2025-09-27 00:43:27.455999 | orchestrator | 2025-09-27 00:43:27 | INFO  | D [3] ---- horizon 2025-09-27 00:43:27.456486 | orchestrator | 2025-09-27 00:43:27 | INFO  | A [3] ---- keystone 2025-09-27 00:43:27.456795 | orchestrator | 2025-09-27 00:43:27 | INFO  | A [4] ----- neutron 2025-09-27 00:43:27.456811 | orchestrator | 2025-09-27 00:43:27 | INFO  | D [5] ------ wait-for-nova 2025-09-27 00:43:27.457291 | orchestrator | 2025-09-27 00:43:27 | INFO  | A [5] ------ octavia 2025-09-27 00:43:27.459081 | orchestrator | 2025-09-27 00:43:27 | INFO  | D [4] ----- barbican 2025-09-27 00:43:27.459286 | orchestrator | 2025-09-27 00:43:27 | INFO  | D [4] ----- designate 2025-09-27 00:43:27.459379 | orchestrator | 2025-09-27 00:43:27 | INFO  | D [4] ----- ironic 2025-09-27 00:43:27.459408 | orchestrator | 2025-09-27 00:43:27 | INFO  | D [4] ----- placement 2025-09-27 00:43:27.459420 | orchestrator | 2025-09-27 00:43:27 | INFO  | D [4] ----- magnum 2025-09-27 00:43:27.460622 | orchestrator | 2025-09-27 00:43:27 | INFO  | A [1] -- openvswitch 2025-09-27 00:43:27.460644 | orchestrator | 2025-09-27 00:43:27 | INFO  | D [2] --- ovn 2025-09-27 00:43:27.460822 | orchestrator | 2025-09-27 00:43:27 | INFO  | D [1] -- memcached 2025-09-27 00:43:27.461079 | orchestrator | 2025-09-27 00:43:27 | INFO  | D [1] -- redis 2025-09-27 00:43:27.461100 | orchestrator | 2025-09-27 00:43:27 | INFO  | D [1] -- rabbitmq-ng 2025-09-27 00:43:27.461677 | orchestrator | 2025-09-27 00:43:27 | INFO  | A [0] - kubernetes 2025-09-27 00:43:27.464107 | orchestrator | 2025-09-27 00:43:27 | INFO  | D [1] -- kubeconfig 2025-09-27 00:43:27.464129 | orchestrator | 2025-09-27 00:43:27 | INFO  | A [1] -- copy-kubeconfig 2025-09-27 00:43:27.464386 | orchestrator | 2025-09-27 00:43:27 | INFO  | A [0] - ceph 2025-09-27 00:43:27.466831 | orchestrator | 2025-09-27 00:43:27 | INFO  | A [1] -- ceph-pools 2025-09-27 
00:43:27.466852 | orchestrator | 2025-09-27 00:43:27 | INFO  | A [2] --- copy-ceph-keys 2025-09-27 00:43:27.466863 | orchestrator | 2025-09-27 00:43:27 | INFO  | A [3] ---- cephclient 2025-09-27 00:43:27.466875 | orchestrator | 2025-09-27 00:43:27 | INFO  | D [4] ----- ceph-bootstrap-dashboard 2025-09-27 00:43:27.466962 | orchestrator | 2025-09-27 00:43:27 | INFO  | A [4] ----- wait-for-keystone 2025-09-27 00:43:27.467132 | orchestrator | 2025-09-27 00:43:27 | INFO  | D [5] ------ kolla-ceph-rgw 2025-09-27 00:43:27.467151 | orchestrator | 2025-09-27 00:43:27 | INFO  | D [5] ------ glance 2025-09-27 00:43:27.467733 | orchestrator | 2025-09-27 00:43:27 | INFO  | D [5] ------ cinder 2025-09-27 00:43:27.467822 | orchestrator | 2025-09-27 00:43:27 | INFO  | D [5] ------ nova 2025-09-27 00:43:27.467866 | orchestrator | 2025-09-27 00:43:27 | INFO  | A [4] ----- prometheus 2025-09-27 00:43:27.467957 | orchestrator | 2025-09-27 00:43:27 | INFO  | D [5] ------ grafana 2025-09-27 00:43:27.648073 | orchestrator | 2025-09-27 00:43:27 | INFO  | All tasks of the collection nutshell are prepared for execution 2025-09-27 00:43:27.648156 | orchestrator | 2025-09-27 00:43:27 | INFO  | Tasks are running in the background 2025-09-27 00:43:30.481673 | orchestrator | 2025-09-27 00:43:30 | INFO  | No task IDs specified, wait for all currently running tasks 2025-09-27 00:43:32.598416 | orchestrator | 2025-09-27 00:43:32 | INFO  | Task a6a10e8b-f7b1-4f45-a12b-29ae21a864dc is in state STARTED 2025-09-27 00:43:32.598775 | orchestrator | 2025-09-27 00:43:32 | INFO  | Task 8fa36388-2203-4150-93bd-9847f61bae4b is in state STARTED 2025-09-27 00:43:32.600239 | orchestrator | 2025-09-27 00:43:32 | INFO  | Task 6dbd6b5c-7572-4a69-b3c9-005adc329209 is in state STARTED 2025-09-27 00:43:32.600663 | orchestrator | 2025-09-27 00:43:32 | INFO  | Task 625d002c-1b01-4e2f-8d5c-d5ab32c60c73 is in state STARTED 2025-09-27 00:43:32.601184 | orchestrator | 2025-09-27 00:43:32 | INFO  | Task 4bfdb937-fc0d-4214-bdfb-fc0928ebfd78 is in state STARTED 2025-09-27 00:43:32.609176 | orchestrator | 2025-09-27 00:43:32 | INFO  | Task 1d877c1d-aea7-49a7-ac1d-fd8e8254af58 is in state STARTED 2025-09-27 00:43:32.609714 | orchestrator | 2025-09-27 00:43:32 | INFO  | Task 16ceff3c-2063-4187-b9f0-c442aab0be5b is in state STARTED 2025-09-27 00:43:32.609738 | orchestrator | 2025-09-27 00:43:32 | INFO  | Wait 1 second(s) until the next check 2025-09-27 00:43:35.642857 | orchestrator | 2025-09-27 00:43:35 | INFO  | Task a6a10e8b-f7b1-4f45-a12b-29ae21a864dc is in state STARTED 2025-09-27 00:43:35.642960 | orchestrator | 2025-09-27 00:43:35 | INFO  | Task 8fa36388-2203-4150-93bd-9847f61bae4b is in state STARTED 2025-09-27 00:43:35.643475 | orchestrator | 2025-09-27 00:43:35 | INFO  | Task 6dbd6b5c-7572-4a69-b3c9-005adc329209 is in state STARTED 2025-09-27 00:43:35.643885 | orchestrator | 2025-09-27 00:43:35 | INFO  | Task 625d002c-1b01-4e2f-8d5c-d5ab32c60c73 is in state STARTED 2025-09-27 00:43:35.644418 | orchestrator | 2025-09-27 00:43:35 | INFO  | Task 4bfdb937-fc0d-4214-bdfb-fc0928ebfd78 is in state STARTED 2025-09-27 00:43:35.644951 | orchestrator | 2025-09-27 00:43:35 | INFO  | Task 1d877c1d-aea7-49a7-ac1d-fd8e8254af58 is in state STARTED 2025-09-27 00:43:35.645514 | orchestrator | 2025-09-27 00:43:35 | INFO  | Task 16ceff3c-2063-4187-b9f0-c442aab0be5b is in state STARTED 2025-09-27 00:43:35.646148 | orchestrator | 2025-09-27 00:43:35 | INFO  | Wait 1 second(s) until the next check 2025-09-27 00:43:38.953029 | orchestrator | 2025-09-27 00:43:38 | INFO  
| Task a6a10e8b-f7b1-4f45-a12b-29ae21a864dc is in state STARTED 2025-09-27 00:43:38.953136 | orchestrator | 2025-09-27 00:43:38 | INFO  | Task 8fa36388-2203-4150-93bd-9847f61bae4b is in state STARTED 2025-09-27 00:43:38.953762 | orchestrator | 2025-09-27 00:43:38 | INFO  | Task 6dbd6b5c-7572-4a69-b3c9-005adc329209 is in state STARTED 2025-09-27 00:43:38.954160 | orchestrator | 2025-09-27 00:43:38 | INFO  | Task 625d002c-1b01-4e2f-8d5c-d5ab32c60c73 is in state STARTED 2025-09-27 00:43:38.954732 | orchestrator | 2025-09-27 00:43:38 | INFO  | Task 4bfdb937-fc0d-4214-bdfb-fc0928ebfd78 is in state STARTED 2025-09-27 00:43:38.955269 | orchestrator | 2025-09-27 00:43:38 | INFO  | Task 1d877c1d-aea7-49a7-ac1d-fd8e8254af58 is in state STARTED 2025-09-27 00:43:38.956274 | orchestrator | 2025-09-27 00:43:38 | INFO  | Task 16ceff3c-2063-4187-b9f0-c442aab0be5b is in state STARTED 2025-09-27 00:43:38.956296 | orchestrator | 2025-09-27 00:43:38 | INFO  | Wait 1 second(s) until the next check 2025-09-27 00:43:42.113723 | orchestrator | 2025-09-27 00:43:42 | INFO  | Task a6a10e8b-f7b1-4f45-a12b-29ae21a864dc is in state STARTED 2025-09-27 00:43:42.116334 | orchestrator | 2025-09-27 00:43:42 | INFO  | Task 8fa36388-2203-4150-93bd-9847f61bae4b is in state STARTED 2025-09-27 00:43:42.117576 | orchestrator | 2025-09-27 00:43:42 | INFO  | Task 6dbd6b5c-7572-4a69-b3c9-005adc329209 is in state STARTED 2025-09-27 00:43:42.117601 | orchestrator | 2025-09-27 00:43:42 | INFO  | Task 625d002c-1b01-4e2f-8d5c-d5ab32c60c73 is in state STARTED 2025-09-27 00:43:42.117919 | orchestrator | 2025-09-27 00:43:42 | INFO  | Task 4bfdb937-fc0d-4214-bdfb-fc0928ebfd78 is in state STARTED 2025-09-27 00:43:42.119015 | orchestrator | 2025-09-27 00:43:42 | INFO  | Task 1d877c1d-aea7-49a7-ac1d-fd8e8254af58 is in state STARTED 2025-09-27 00:43:42.134567 | orchestrator | 2025-09-27 00:43:42 | INFO  | Task 16ceff3c-2063-4187-b9f0-c442aab0be5b is in state STARTED 2025-09-27 00:43:42.134590 | orchestrator | 2025-09-27 00:43:42 | INFO  | Wait 1 second(s) until the next check 2025-09-27 00:43:45.276249 | orchestrator | 2025-09-27 00:43:45 | INFO  | Task a6a10e8b-f7b1-4f45-a12b-29ae21a864dc is in state STARTED 2025-09-27 00:43:45.276359 | orchestrator | 2025-09-27 00:43:45 | INFO  | Task 8fa36388-2203-4150-93bd-9847f61bae4b is in state STARTED 2025-09-27 00:43:45.276376 | orchestrator | 2025-09-27 00:43:45 | INFO  | Task 6dbd6b5c-7572-4a69-b3c9-005adc329209 is in state STARTED 2025-09-27 00:43:45.276388 | orchestrator | 2025-09-27 00:43:45 | INFO  | Task 625d002c-1b01-4e2f-8d5c-d5ab32c60c73 is in state STARTED 2025-09-27 00:43:45.276399 | orchestrator | 2025-09-27 00:43:45 | INFO  | Task 4bfdb937-fc0d-4214-bdfb-fc0928ebfd78 is in state STARTED 2025-09-27 00:43:45.276410 | orchestrator | 2025-09-27 00:43:45 | INFO  | Task 1d877c1d-aea7-49a7-ac1d-fd8e8254af58 is in state STARTED 2025-09-27 00:43:45.276420 | orchestrator | 2025-09-27 00:43:45 | INFO  | Task 16ceff3c-2063-4187-b9f0-c442aab0be5b is in state STARTED 2025-09-27 00:43:45.276432 | orchestrator | 2025-09-27 00:43:45 | INFO  | Wait 1 second(s) until the next check 2025-09-27 00:43:48.307078 | orchestrator | 2025-09-27 00:43:48 | INFO  | Task a6a10e8b-f7b1-4f45-a12b-29ae21a864dc is in state STARTED 2025-09-27 00:43:48.307662 | orchestrator | 2025-09-27 00:43:48 | INFO  | Task 8fa36388-2203-4150-93bd-9847f61bae4b is in state STARTED 2025-09-27 00:43:48.308678 | orchestrator | 2025-09-27 00:43:48 | INFO  | Task 6dbd6b5c-7572-4a69-b3c9-005adc329209 is in state STARTED 2025-09-27 00:43:48.310490 
| orchestrator | 2025-09-27 00:43:48 | INFO  | Task 625d002c-1b01-4e2f-8d5c-d5ab32c60c73 is in state STARTED 2025-09-27 00:43:48.311192 | orchestrator | 2025-09-27 00:43:48 | INFO  | Task 4bfdb937-fc0d-4214-bdfb-fc0928ebfd78 is in state STARTED 2025-09-27 00:43:48.312017 | orchestrator | 2025-09-27 00:43:48 | INFO  | Task 1d877c1d-aea7-49a7-ac1d-fd8e8254af58 is in state STARTED 2025-09-27 00:43:48.313001 | orchestrator | 2025-09-27 00:43:48 | INFO  | Task 16ceff3c-2063-4187-b9f0-c442aab0be5b is in state STARTED 2025-09-27 00:43:48.313030 | orchestrator | 2025-09-27 00:43:48 | INFO  | Wait 1 second(s) until the next check 2025-09-27 00:43:51.485548 | orchestrator | 2025-09-27 00:43:51 | INFO  | Task a6a10e8b-f7b1-4f45-a12b-29ae21a864dc is in state STARTED 2025-09-27 00:43:51.486589 | orchestrator | 2025-09-27 00:43:51 | INFO  | Task 8fa36388-2203-4150-93bd-9847f61bae4b is in state STARTED 2025-09-27 00:43:51.487988 | orchestrator | 2025-09-27 00:43:51 | INFO  | Task 6dbd6b5c-7572-4a69-b3c9-005adc329209 is in state STARTED 2025-09-27 00:43:51.491519 | orchestrator | 2025-09-27 00:43:51 | INFO  | Task 625d002c-1b01-4e2f-8d5c-d5ab32c60c73 is in state STARTED 2025-09-27 00:43:51.493743 | orchestrator | 2025-09-27 00:43:51 | INFO  | Task 4bfdb937-fc0d-4214-bdfb-fc0928ebfd78 is in state STARTED 2025-09-27 00:43:51.495383 | orchestrator | 2025-09-27 00:43:51 | INFO  | Task 1d877c1d-aea7-49a7-ac1d-fd8e8254af58 is in state STARTED 2025-09-27 00:43:51.495997 | orchestrator | 2025-09-27 00:43:51 | INFO  | Task 16ceff3c-2063-4187-b9f0-c442aab0be5b is in state STARTED 2025-09-27 00:43:51.496019 | orchestrator | 2025-09-27 00:43:51 | INFO  | Wait 1 second(s) until the next check 2025-09-27 00:43:54.554388 | orchestrator | 2025-09-27 00:43:54 | INFO  | Task a6a10e8b-f7b1-4f45-a12b-29ae21a864dc is in state STARTED 2025-09-27 00:43:54.554601 | orchestrator | 2025-09-27 00:43:54 | INFO  | Task 8fa36388-2203-4150-93bd-9847f61bae4b is in state STARTED 2025-09-27 00:43:54.555185 | orchestrator | 2025-09-27 00:43:54 | INFO  | Task 6dbd6b5c-7572-4a69-b3c9-005adc329209 is in state STARTED 2025-09-27 00:43:54.555910 | orchestrator | 2025-09-27 00:43:54 | INFO  | Task 6738d185-a1b4-49ce-965d-88758fa6634a is in state STARTED 2025-09-27 00:43:54.557268 | orchestrator | 2025-09-27 00:43:54 | INFO  | Task 625d002c-1b01-4e2f-8d5c-d5ab32c60c73 is in state STARTED 2025-09-27 00:43:54.558533 | orchestrator | 2025-09-27 00:43:54 | INFO  | Task 4bfdb937-fc0d-4214-bdfb-fc0928ebfd78 is in state SUCCESS 2025-09-27 00:43:54.558916 | orchestrator | 2025-09-27 00:43:54.558942 | orchestrator | PLAY [Apply role geerlingguy.dotfiles] ***************************************** 2025-09-27 00:43:54.558955 | orchestrator | 2025-09-27 00:43:54.558965 | orchestrator | TASK [geerlingguy.dotfiles : Ensure dotfiles repository is cloned locally.] 
**** 2025-09-27 00:43:54.558977 | orchestrator | Saturday 27 September 2025 00:43:38 +0000 (0:00:00.454) 0:00:00.454 **** 2025-09-27 00:43:54.558987 | orchestrator | changed: [testbed-node-0] 2025-09-27 00:43:54.558999 | orchestrator | changed: [testbed-node-1] 2025-09-27 00:43:54.559010 | orchestrator | changed: [testbed-node-4] 2025-09-27 00:43:54.559020 | orchestrator | changed: [testbed-node-5] 2025-09-27 00:43:54.559031 | orchestrator | changed: [testbed-node-3] 2025-09-27 00:43:54.559041 | orchestrator | changed: [testbed-node-2] 2025-09-27 00:43:54.559052 | orchestrator | changed: [testbed-manager] 2025-09-27 00:43:54.559062 | orchestrator | 2025-09-27 00:43:54.559073 | orchestrator | TASK [geerlingguy.dotfiles : Ensure all configured dotfiles are links.] ******** 2025-09-27 00:43:54.559084 | orchestrator | Saturday 27 September 2025 00:43:42 +0000 (0:00:03.846) 0:00:04.300 **** 2025-09-27 00:43:54.559095 | orchestrator | ok: [testbed-node-0] => (item=.tmux.conf) 2025-09-27 00:43:54.559106 | orchestrator | ok: [testbed-manager] => (item=.tmux.conf) 2025-09-27 00:43:54.559116 | orchestrator | ok: [testbed-node-1] => (item=.tmux.conf) 2025-09-27 00:43:54.559127 | orchestrator | ok: [testbed-node-2] => (item=.tmux.conf) 2025-09-27 00:43:54.559137 | orchestrator | ok: [testbed-node-3] => (item=.tmux.conf) 2025-09-27 00:43:54.559148 | orchestrator | ok: [testbed-node-4] => (item=.tmux.conf) 2025-09-27 00:43:54.559159 | orchestrator | ok: [testbed-node-5] => (item=.tmux.conf) 2025-09-27 00:43:54.559170 | orchestrator | 2025-09-27 00:43:54.559180 | orchestrator | TASK [geerlingguy.dotfiles : Remove existing dotfiles file if a replacement is being linked.] *** 2025-09-27 00:43:54.559191 | orchestrator | Saturday 27 September 2025 00:43:44 +0000 (0:00:01.914) 0:00:06.215 **** 2025-09-27 00:43:54.559247 | orchestrator | ok: [testbed-node-0] => (item=[0, {'changed': False, 'stdout': '', 'stderr': "ls: cannot access '/home/dragon/.tmux.conf': No such file or directory", 'rc': 2, 'cmd': ['ls', '-F', '~/.tmux.conf'], 'start': '2025-09-27 00:43:42.741424', 'end': '2025-09-27 00:43:42.751680', 'delta': '0:00:00.010256', 'failed': False, 'msg': 'non-zero return code', 'invocation': {'module_args': {'_raw_params': 'ls -F ~/.tmux.conf', '_uses_shell': False, 'expand_argument_vars': True, 'stdin_add_newline': True, 'strip_empty_ends': True, 'argv': None, 'chdir': None, 'executable': None, 'creates': None, 'removes': None, 'stdin': None}}, 'stdout_lines': [], 'stderr_lines': ["ls: cannot access '/home/dragon/.tmux.conf': No such file or directory"], 'failed_when_result': False, 'item': '.tmux.conf', 'ansible_loop_var': 'item'}]) 2025-09-27 00:43:54.559289 | orchestrator | ok: [testbed-manager] => (item=[0, {'changed': False, 'stdout': '', 'stderr': "ls: cannot access '/home/dragon/.tmux.conf': No such file or directory", 'rc': 2, 'cmd': ['ls', '-F', '~/.tmux.conf'], 'start': '2025-09-27 00:43:42.948103', 'end': '2025-09-27 00:43:42.952345', 'delta': '0:00:00.004242', 'failed': False, 'msg': 'non-zero return code', 'invocation': {'module_args': {'_raw_params': 'ls -F ~/.tmux.conf', '_uses_shell': False, 'expand_argument_vars': True, 'stdin_add_newline': True, 'strip_empty_ends': True, 'argv': None, 'chdir': None, 'executable': None, 'creates': None, 'removes': None, 'stdin': None}}, 'stdout_lines': [], 'stderr_lines': ["ls: cannot access '/home/dragon/.tmux.conf': No such file or directory"], 'failed_when_result': False, 'item': '.tmux.conf', 'ansible_loop_var': 'item'}]) 2025-09-27 00:43:54.559301 | 
orchestrator | ok: [testbed-node-1] => (item=[0, {'changed': False, 'stdout': '', 'stderr': "ls: cannot access '/home/dragon/.tmux.conf': No such file or directory", 'rc': 2, 'cmd': ['ls', '-F', '~/.tmux.conf'], 'start': '2025-09-27 00:43:42.965902', 'end': '2025-09-27 00:43:42.973926', 'delta': '0:00:00.008024', 'failed': False, 'msg': 'non-zero return code', 'invocation': {'module_args': {'_raw_params': 'ls -F ~/.tmux.conf', '_uses_shell': False, 'expand_argument_vars': True, 'stdin_add_newline': True, 'strip_empty_ends': True, 'argv': None, 'chdir': None, 'executable': None, 'creates': None, 'removes': None, 'stdin': None}}, 'stdout_lines': [], 'stderr_lines': ["ls: cannot access '/home/dragon/.tmux.conf': No such file or directory"], 'failed_when_result': False, 'item': '.tmux.conf', 'ansible_loop_var': 'item'}]) 2025-09-27 00:43:54.559336 | orchestrator | ok: [testbed-node-2] => (item=[0, {'changed': False, 'stdout': '', 'stderr': "ls: cannot access '/home/dragon/.tmux.conf': No such file or directory", 'rc': 2, 'cmd': ['ls', '-F', '~/.tmux.conf'], 'start': '2025-09-27 00:43:43.138644', 'end': '2025-09-27 00:43:43.146516', 'delta': '0:00:00.007872', 'failed': False, 'msg': 'non-zero return code', 'invocation': {'module_args': {'_raw_params': 'ls -F ~/.tmux.conf', '_uses_shell': False, 'expand_argument_vars': True, 'stdin_add_newline': True, 'strip_empty_ends': True, 'argv': None, 'chdir': None, 'executable': None, 'creates': None, 'removes': None, 'stdin': None}}, 'stdout_lines': [], 'stderr_lines': ["ls: cannot access '/home/dragon/.tmux.conf': No such file or directory"], 'failed_when_result': False, 'item': '.tmux.conf', 'ansible_loop_var': 'item'}]) 2025-09-27 00:43:54.559348 | orchestrator | ok: [testbed-node-3] => (item=[0, {'changed': False, 'stdout': '', 'stderr': "ls: cannot access '/home/dragon/.tmux.conf': No such file or directory", 'rc': 2, 'cmd': ['ls', '-F', '~/.tmux.conf'], 'start': '2025-09-27 00:43:43.357228', 'end': '2025-09-27 00:43:43.367301', 'delta': '0:00:00.010073', 'failed': False, 'msg': 'non-zero return code', 'invocation': {'module_args': {'_raw_params': 'ls -F ~/.tmux.conf', '_uses_shell': False, 'expand_argument_vars': True, 'stdin_add_newline': True, 'strip_empty_ends': True, 'argv': None, 'chdir': None, 'executable': None, 'creates': None, 'removes': None, 'stdin': None}}, 'stdout_lines': [], 'stderr_lines': ["ls: cannot access '/home/dragon/.tmux.conf': No such file or directory"], 'failed_when_result': False, 'item': '.tmux.conf', 'ansible_loop_var': 'item'}]) 2025-09-27 00:43:54.559365 | orchestrator | ok: [testbed-node-4] => (item=[0, {'changed': False, 'stdout': '', 'stderr': "ls: cannot access '/home/dragon/.tmux.conf': No such file or directory", 'rc': 2, 'cmd': ['ls', '-F', '~/.tmux.conf'], 'start': '2025-09-27 00:43:43.631174', 'end': '2025-09-27 00:43:43.641616', 'delta': '0:00:00.010442', 'failed': False, 'msg': 'non-zero return code', 'invocation': {'module_args': {'_raw_params': 'ls -F ~/.tmux.conf', '_uses_shell': False, 'expand_argument_vars': True, 'stdin_add_newline': True, 'strip_empty_ends': True, 'argv': None, 'chdir': None, 'executable': None, 'creates': None, 'removes': None, 'stdin': None}}, 'stdout_lines': [], 'stderr_lines': ["ls: cannot access '/home/dragon/.tmux.conf': No such file or directory"], 'failed_when_result': False, 'item': '.tmux.conf', 'ansible_loop_var': 'item'}]) 2025-09-27 00:43:54.559389 | orchestrator | ok: [testbed-node-5] => (item=[0, {'changed': False, 'stdout': '', 'stderr': "ls: cannot access 
'/home/dragon/.tmux.conf': No such file or directory", 'rc': 2, 'cmd': ['ls', '-F', '~/.tmux.conf'], 'start': '2025-09-27 00:43:44.002879', 'end': '2025-09-27 00:43:44.010733', 'delta': '0:00:00.007854', 'failed': False, 'msg': 'non-zero return code', 'invocation': {'module_args': {'_raw_params': 'ls -F ~/.tmux.conf', '_uses_shell': False, 'expand_argument_vars': True, 'stdin_add_newline': True, 'strip_empty_ends': True, 'argv': None, 'chdir': None, 'executable': None, 'creates': None, 'removes': None, 'stdin': None}}, 'stdout_lines': [], 'stderr_lines': ["ls: cannot access '/home/dragon/.tmux.conf': No such file or directory"], 'failed_when_result': False, 'item': '.tmux.conf', 'ansible_loop_var': 'item'}]) 2025-09-27 00:43:54.559401 | orchestrator | 2025-09-27 00:43:54.559412 | orchestrator | TASK [geerlingguy.dotfiles : Ensure parent folders of link dotfiles exist.] **** 2025-09-27 00:43:54.559423 | orchestrator | Saturday 27 September 2025 00:43:46 +0000 (0:00:02.167) 0:00:08.382 **** 2025-09-27 00:43:54.559433 | orchestrator | ok: [testbed-node-0] => (item=.tmux.conf) 2025-09-27 00:43:54.559444 | orchestrator | ok: [testbed-manager] => (item=.tmux.conf) 2025-09-27 00:43:54.559455 | orchestrator | ok: [testbed-node-1] => (item=.tmux.conf) 2025-09-27 00:43:54.559465 | orchestrator | ok: [testbed-node-2] => (item=.tmux.conf) 2025-09-27 00:43:54.559475 | orchestrator | ok: [testbed-node-3] => (item=.tmux.conf) 2025-09-27 00:43:54.559486 | orchestrator | ok: [testbed-node-4] => (item=.tmux.conf) 2025-09-27 00:43:54.559496 | orchestrator | ok: [testbed-node-5] => (item=.tmux.conf) 2025-09-27 00:43:54.559509 | orchestrator | 2025-09-27 00:43:54.559521 | orchestrator | TASK [geerlingguy.dotfiles : Link dotfiles into home folder.] ****************** 2025-09-27 00:43:54.559533 | orchestrator | Saturday 27 September 2025 00:43:48 +0000 (0:00:01.812) 0:00:10.194 **** 2025-09-27 00:43:54.559546 | orchestrator | changed: [testbed-manager] => (item=.tmux.conf) 2025-09-27 00:43:54.559558 | orchestrator | changed: [testbed-node-1] => (item=.tmux.conf) 2025-09-27 00:43:54.559570 | orchestrator | changed: [testbed-node-0] => (item=.tmux.conf) 2025-09-27 00:43:54.559583 | orchestrator | changed: [testbed-node-2] => (item=.tmux.conf) 2025-09-27 00:43:54.559596 | orchestrator | changed: [testbed-node-4] => (item=.tmux.conf) 2025-09-27 00:43:54.559608 | orchestrator | changed: [testbed-node-3] => (item=.tmux.conf) 2025-09-27 00:43:54.559620 | orchestrator | changed: [testbed-node-5] => (item=.tmux.conf) 2025-09-27 00:43:54.559633 | orchestrator | 2025-09-27 00:43:54.559645 | orchestrator | PLAY RECAP ********************************************************************* 2025-09-27 00:43:54.559664 | orchestrator | testbed-manager : ok=5  changed=2  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 2025-09-27 00:43:54.559678 | orchestrator | testbed-node-0 : ok=5  changed=2  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 2025-09-27 00:43:54.559691 | orchestrator | testbed-node-1 : ok=5  changed=2  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 2025-09-27 00:43:54.559703 | orchestrator | testbed-node-2 : ok=5  changed=2  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 2025-09-27 00:43:54.559716 | orchestrator | testbed-node-3 : ok=5  changed=2  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 2025-09-27 00:43:54.559728 | orchestrator | testbed-node-4 : ok=5  changed=2  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 2025-09-27 00:43:54.559748 | orchestrator | testbed-node-5 : ok=5  
changed=2  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 2025-09-27 00:43:54.559760 | orchestrator | 2025-09-27 00:43:54.559772 | orchestrator | 2025-09-27 00:43:54.559784 | orchestrator | TASKS RECAP ******************************************************************** 2025-09-27 00:43:54.559797 | orchestrator | Saturday 27 September 2025 00:43:51 +0000 (0:00:03.249) 0:00:13.444 **** 2025-09-27 00:43:54.559809 | orchestrator | =============================================================================== 2025-09-27 00:43:54.559821 | orchestrator | geerlingguy.dotfiles : Ensure dotfiles repository is cloned locally. ---- 3.85s 2025-09-27 00:43:54.559834 | orchestrator | geerlingguy.dotfiles : Link dotfiles into home folder. ------------------ 3.25s 2025-09-27 00:43:54.559846 | orchestrator | geerlingguy.dotfiles : Remove existing dotfiles file if a replacement is being linked. --- 2.17s 2025-09-27 00:43:54.559859 | orchestrator | geerlingguy.dotfiles : Ensure all configured dotfiles are links. -------- 1.91s 2025-09-27 00:43:54.559875 | orchestrator | geerlingguy.dotfiles : Ensure parent folders of link dotfiles exist. ---- 1.81s 2025-09-27 00:43:54.560161 | orchestrator | 2025-09-27 00:43:54 | INFO  | Task 1d877c1d-aea7-49a7-ac1d-fd8e8254af58 is in state STARTED 2025-09-27 00:43:54.561774 | orchestrator | 2025-09-27 00:43:54 | INFO  | Task 16ceff3c-2063-4187-b9f0-c442aab0be5b is in state STARTED 2025-09-27 00:43:54.561802 | orchestrator | 2025-09-27 00:43:54 | INFO  | Wait 1 second(s) until the next check 2025-09-27 00:43:57.737606 | orchestrator | 2025-09-27 00:43:57.737700 | orchestrator | 2025-09-27 00:43:57.737714 | orchestrator | PLAY [Apply role common] ******************************************************* 2025-09-27 00:43:57.737724 | orchestrator | 2025-09-27 00:43:57.737734 | orchestrator | TASK [common : include_tasks] ************************************************** 2025-09-27 00:43:57.737744 | orchestrator | Saturday 27 September 2025 00:43:32 +0000 (0:00:00.233) 0:00:00.233 **** 2025-09-27 00:43:57.737755 | orchestrator | included: /ansible/roles/common/tasks/deploy.yml for testbed-manager, testbed-node-0, testbed-node-1, testbed-node-2, testbed-node-3, testbed-node-4, testbed-node-5 2025-09-27 00:43:57.737766 | orchestrator | 2025-09-27 00:43:57.737776 | orchestrator | TASK [common : Ensuring config directories exist] ****************************** 2025-09-27 00:43:57.737786 | orchestrator | Saturday 27 September 2025 00:43:33 +0000 (0:00:01.088) 0:00:01.322 **** 2025-09-27 00:43:57.737795 | orchestrator | changed: [testbed-node-0] => (item=[{'service_name': 'cron'}, 'cron']) 2025-09-27 00:43:57.737805 | orchestrator | changed: [testbed-manager] => (item=[{'service_name': 'cron'}, 'cron']) 2025-09-27 00:43:57.737815 | orchestrator | changed: [testbed-node-1] => (item=[{'service_name': 'cron'}, 'cron']) 2025-09-27 00:43:57.737824 | orchestrator | changed: [testbed-node-2] => (item=[{'service_name': 'cron'}, 'cron']) 2025-09-27 00:43:57.737834 | orchestrator | changed: [testbed-node-3] => (item=[{'service_name': 'cron'}, 'cron']) 2025-09-27 00:43:57.737843 | orchestrator | changed: [testbed-manager] => (item=[{'service_name': 'fluentd'}, 'fluentd']) 2025-09-27 00:43:57.737852 | orchestrator | changed: [testbed-node-1] => (item=[{'service_name': 'fluentd'}, 'fluentd']) 2025-09-27 00:43:57.737862 | orchestrator | changed: [testbed-node-0] => (item=[{'service_name': 'fluentd'}, 'fluentd']) 2025-09-27 00:43:57.737871 | orchestrator | changed: [testbed-node-5] => 
(item=[{'service_name': 'cron'}, 'cron']) 2025-09-27 00:43:57.737906 | orchestrator | changed: [testbed-node-2] => (item=[{'service_name': 'fluentd'}, 'fluentd']) 2025-09-27 00:43:57.737917 | orchestrator | changed: [testbed-node-4] => (item=[{'service_name': 'cron'}, 'cron']) 2025-09-27 00:43:57.737927 | orchestrator | changed: [testbed-node-3] => (item=[{'service_name': 'fluentd'}, 'fluentd']) 2025-09-27 00:43:57.737937 | orchestrator | changed: [testbed-node-0] => (item=[{'service_name': 'kolla-toolbox'}, 'kolla-toolbox']) 2025-09-27 00:43:57.737947 | orchestrator | changed: [testbed-manager] => (item=[{'service_name': 'kolla-toolbox'}, 'kolla-toolbox']) 2025-09-27 00:43:57.737976 | orchestrator | changed: [testbed-node-1] => (item=[{'service_name': 'kolla-toolbox'}, 'kolla-toolbox']) 2025-09-27 00:43:57.737986 | orchestrator | changed: [testbed-node-5] => (item=[{'service_name': 'fluentd'}, 'fluentd']) 2025-09-27 00:43:57.737997 | orchestrator | changed: [testbed-node-3] => (item=[{'service_name': 'kolla-toolbox'}, 'kolla-toolbox']) 2025-09-27 00:43:57.738007 | orchestrator | changed: [testbed-node-2] => (item=[{'service_name': 'kolla-toolbox'}, 'kolla-toolbox']) 2025-09-27 00:43:57.738090 | orchestrator | changed: [testbed-node-4] => (item=[{'service_name': 'fluentd'}, 'fluentd']) 2025-09-27 00:43:57.738134 | orchestrator | changed: [testbed-node-5] => (item=[{'service_name': 'kolla-toolbox'}, 'kolla-toolbox']) 2025-09-27 00:43:57.738148 | orchestrator | changed: [testbed-node-4] => (item=[{'service_name': 'kolla-toolbox'}, 'kolla-toolbox']) 2025-09-27 00:43:57.738159 | orchestrator | 2025-09-27 00:43:57.738170 | orchestrator | TASK [common : include_tasks] ************************************************** 2025-09-27 00:43:57.738181 | orchestrator | Saturday 27 September 2025 00:43:37 +0000 (0:00:03.999) 0:00:05.322 **** 2025-09-27 00:43:57.738194 | orchestrator | included: /ansible/roles/common/tasks/copy-certs.yml for testbed-manager, testbed-node-0, testbed-node-1, testbed-node-2, testbed-node-3, testbed-node-4, testbed-node-5 2025-09-27 00:43:57.738227 | orchestrator | 2025-09-27 00:43:57.738238 | orchestrator | TASK [service-cert-copy : common | Copying over extra CA certificates] ********* 2025-09-27 00:43:57.738248 | orchestrator | Saturday 27 September 2025 00:43:38 +0000 (0:00:01.189) 0:00:06.511 **** 2025-09-27 00:43:57.738263 | orchestrator | changed: [testbed-manager] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/fluentd:2024.2', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}}) 2025-09-27 00:43:57.738279 | orchestrator | changed: [testbed-node-0] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/fluentd:2024.2', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}}) 2025-09-27 00:43:57.738348 | orchestrator | changed: [testbed-node-1] => (item={'key': 
'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/fluentd:2024.2', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}}) 2025-09-27 00:43:57.738363 | orchestrator | changed: [testbed-node-2] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/fluentd:2024.2', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}}) 2025-09-27 00:43:57.738374 | orchestrator | changed: [testbed-node-5] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/fluentd:2024.2', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}}) 2025-09-27 00:43:57.738396 | orchestrator | changed: [testbed-node-4] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/fluentd:2024.2', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}}) 2025-09-27 00:43:57.738414 | orchestrator | changed: [testbed-node-0] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/kolla-toolbox:2024.2', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-09-27 00:43:57.738684 | orchestrator | changed: [testbed-manager] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/kolla-toolbox:2024.2', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-09-27 00:43:57.738696 | orchestrator | changed: [testbed-node-3] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 
'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/fluentd:2024.2', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}}) 2025-09-27 00:43:57.738757 | orchestrator | changed: [testbed-node-1] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/kolla-toolbox:2024.2', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-09-27 00:43:57.738769 | orchestrator | changed: [testbed-node-2] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/kolla-toolbox:2024.2', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-09-27 00:43:57.738783 | orchestrator | changed: [testbed-node-4] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/kolla-toolbox:2024.2', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-09-27 00:43:57.738801 | orchestrator | changed: [testbed-node-5] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/kolla-toolbox:2024.2', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-09-27 00:43:57.738813 | orchestrator | changed: [testbed-node-0] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/cron:2024.2', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-09-27 00:43:57.738834 | orchestrator | changed: [testbed-node-3] => (item={'key': 'kolla-toolbox', 'value': 
{'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/kolla-toolbox:2024.2', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-09-27 00:43:57.738844 | orchestrator | changed: [testbed-node-1] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/cron:2024.2', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-09-27 00:43:57.738854 | orchestrator | changed: [testbed-node-2] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/cron:2024.2', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-09-27 00:43:57.738875 | orchestrator | changed: [testbed-manager] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/cron:2024.2', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-09-27 00:43:57.738886 | orchestrator | changed: [testbed-node-5] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/cron:2024.2', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-09-27 00:43:57.738895 | orchestrator | changed: [testbed-node-4] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/cron:2024.2', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-09-27 00:43:57.738917 | orchestrator | changed: [testbed-node-3] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/cron:2024.2', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-09-27 00:43:57.738927 | orchestrator | 2025-09-27 00:43:57.738937 | orchestrator | TASK [service-cert-copy : common | Copying over backend internal TLS certificate] *** 2025-09-27 00:43:57.738947 | orchestrator | Saturday 27 September 2025 00:43:43 +0000 (0:00:05.643) 0:00:12.154 **** 
2025-09-27 00:43:57.738958 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/fluentd:2024.2', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}})  2025-09-27 00:43:57.738968 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/kolla-toolbox:2024.2', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2025-09-27 00:43:57.738979 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/cron:2024.2', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2025-09-27 00:43:57.738989 | orchestrator | skipping: [testbed-manager] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/fluentd:2024.2', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}})  2025-09-27 00:43:57.739010 | orchestrator | skipping: [testbed-manager] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/kolla-toolbox:2024.2', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2025-09-27 00:43:57.739020 | orchestrator | skipping: [testbed-manager] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/cron:2024.2', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2025-09-27 00:43:57.739037 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:43:57.739052 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': 
True, 'image': 'registry.osism.tech/kolla/fluentd:2024.2', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}})  2025-09-27 00:43:57.739062 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/kolla-toolbox:2024.2', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2025-09-27 00:43:57.739073 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/cron:2024.2', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2025-09-27 00:43:57.739083 | orchestrator | skipping: [testbed-manager] 2025-09-27 00:43:57.739092 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:43:57.739102 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/fluentd:2024.2', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}})  2025-09-27 00:43:57.739112 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/kolla-toolbox:2024.2', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2025-09-27 00:43:57.739122 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/cron:2024.2', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2025-09-27 00:43:57.739144 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/fluentd:2024.2', 'environment': {'KOLLA_CONFIG_STRATEGY': 
'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}})  2025-09-27 00:43:57.739162 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/kolla-toolbox:2024.2', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2025-09-27 00:43:57.739179 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/cron:2024.2', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2025-09-27 00:43:57.739190 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/fluentd:2024.2', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}})  2025-09-27 00:43:57.739200 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/kolla-toolbox:2024.2', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2025-09-27 00:43:57.739269 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/cron:2024.2', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2025-09-27 00:43:57.739280 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:43:57.739289 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:43:57.739299 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:43:57.739309 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/fluentd:2024.2', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': 
['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}})  2025-09-27 00:43:57.739326 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/kolla-toolbox:2024.2', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2025-09-27 00:43:57.739343 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/cron:2024.2', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2025-09-27 00:43:57.739353 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:43:57.739363 | orchestrator | 2025-09-27 00:43:57.739373 | orchestrator | TASK [service-cert-copy : common | Copying over backend internal TLS key] ****** 2025-09-27 00:43:57.739382 | orchestrator | Saturday 27 September 2025 00:43:45 +0000 (0:00:01.792) 0:00:13.946 **** 2025-09-27 00:43:57.739397 | orchestrator | skipping: [testbed-manager] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/fluentd:2024.2', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}})  2025-09-27 00:43:57.739408 | orchestrator | skipping: [testbed-manager] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/kolla-toolbox:2024.2', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2025-09-27 00:43:57.739418 | orchestrator | skipping: [testbed-manager] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/cron:2024.2', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2025-09-27 00:43:57.739428 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 
'image': 'registry.osism.tech/kolla/fluentd:2024.2', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}})  2025-09-27 00:43:57.739438 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/kolla-toolbox:2024.2', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2025-09-27 00:43:57.739448 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/cron:2024.2', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2025-09-27 00:43:57.739464 | orchestrator | skipping: [testbed-manager] 2025-09-27 00:43:57.739473 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:43:57.739495 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/fluentd:2024.2', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}})  2025-09-27 00:43:57.739506 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/kolla-toolbox:2024.2', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2025-09-27 00:43:57.739516 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/cron:2024.2', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2025-09-27 00:43:57.739526 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:43:57.739536 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 
'registry.osism.tech/kolla/fluentd:2024.2', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}})  2025-09-27 00:43:57.739546 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/kolla-toolbox:2024.2', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2025-09-27 00:43:57.739577 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/cron:2024.2', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2025-09-27 00:43:57.739587 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:43:57.739597 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/fluentd:2024.2', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}})  2025-09-27 00:43:57.739619 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/kolla-toolbox:2024.2', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2025-09-27 00:43:57 | INFO  | Task a6a10e8b-f7b1-4f45-a12b-29ae21a864dc is in state STARTED 2025-09-27 00:43:57.739630 | orchestrator | 2025-09-27 00:43:57 | INFO  | Task 8fa36388-2203-4150-93bd-9847f61bae4b is in state STARTED 2025-09-27 00:43:57.739640 | orchestrator | 2025-09-27 00:43:57 | INFO  | Task 6dbd6b5c-7572-4a69-b3c9-005adc329209 is in state SUCCESS 2025-09-27 00:43:57.739650 | orchestrator | 2025-09-27 00:43:57.739660 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/cron:2024.2', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro',
'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2025-09-27 00:43:57.739670 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:43:57.739684 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/fluentd:2024.2', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}})  2025-09-27 00:43:57.739695 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/kolla-toolbox:2024.2', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2025-09-27 00:43:57.739705 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/cron:2024.2', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2025-09-27 00:43:57.739715 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:43:57.739724 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/fluentd:2024.2', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}})  2025-09-27 00:43:57.739735 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/kolla-toolbox:2024.2', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2025-09-27 00:43:57.739751 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/cron:2024.2', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2025-09-27 00:43:57.739761 | orchestrator | 
skipping: [testbed-node-2] 2025-09-27 00:43:57.739771 | orchestrator | 2025-09-27 00:43:57.739785 | orchestrator | TASK [common : Copying over /run subdirectories conf] ************************** 2025-09-27 00:43:57.739793 | orchestrator | Saturday 27 September 2025 00:43:47 +0000 (0:00:01.852) 0:00:15.799 **** 2025-09-27 00:43:57.739801 | orchestrator | skipping: [testbed-manager] 2025-09-27 00:43:57.739809 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:43:57.739817 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:43:57.739825 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:43:57.739832 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:43:57.739840 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:43:57.739848 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:43:57.739855 | orchestrator | 2025-09-27 00:43:57.739863 | orchestrator | TASK [common : Restart systemd-tmpfiles] *************************************** 2025-09-27 00:43:57.739871 | orchestrator | Saturday 27 September 2025 00:43:48 +0000 (0:00:01.300) 0:00:17.100 **** 2025-09-27 00:43:57.739879 | orchestrator | skipping: [testbed-manager] 2025-09-27 00:43:57.739886 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:43:57.739894 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:43:57.739902 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:43:57.739909 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:43:57.739917 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:43:57.739925 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:43:57.739932 | orchestrator | 2025-09-27 00:43:57.739940 | orchestrator | TASK [common : Copying over config.json files for services] ******************** 2025-09-27 00:43:57.739948 | orchestrator | Saturday 27 September 2025 00:43:50 +0000 (0:00:02.117) 0:00:19.217 **** 2025-09-27 00:43:57.739961 | orchestrator | An exception occurred during task execution. To see the full traceback, use -vvv. 
The error was: ansible.errors.AnsibleUndefinedVariable: [{'name': 'swift', 'enabled': "{{ enable_swift | bool and (inventory_hostname in groups['swift-proxy-server'] or inventory_hostname in groups['swift-account-server'] or inventory_hostname in groups['swift-container-server'] or inventory_hostname in groups['swift-object-server']) }}", 'facility': '{{ syslog_swift_facility }}', 'logdir': 'swift', 'logfile': 'swift_latest', 'output_tag': True, 'output_time': True}, {'name': 'haproxy', 'enabled': "{{ enable_haproxy | bool and inventory_hostname in groups['loadbalancer'] }}", 'facility': '{{ syslog_haproxy_facility }}', 'logdir': 'haproxy', 'logfile': 'haproxy_latest'}, {'name': 'glance_tls_proxy', 'enabled': "{{ glance_enable_tls_backend | bool and inventory_hostname in groups['glance-api'] }}", 'facility': '{{ syslog_glance_tls_proxy_facility }}', 'logdir': 'glance-tls-proxy', 'logfile': 'glance-tls-proxy'}, {'name': 'neutron_tls_proxy', 'enabled': "{{ neutron_enable_tls_backend | bool and inventory_hostname in groups['neutron-server'] }}", 'facility': '{{ syslog_neutron_tls_proxy_facility }}', 'logdir': 'neutron-tls-proxy', 'logfile': 'neutron-tls-proxy'}]: 'enable_swift' is undefined 2025-09-27 00:43:57.739982 | orchestrator | failed: [testbed-node-0] (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/fluentd:2024.2', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}}) => {"ansible_loop_var": "item", "changed": false, "item": {"key": "fluentd", "value": {"container_name": "fluentd", "dimensions": {}, "enabled": true, "environment": {"KOLLA_CONFIG_STRATEGY": "COPY_ALWAYS"}, "group": "fluentd", "image": "registry.osism.tech/kolla/fluentd:2024.2", "volumes": ["/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro", "/etc/localtime:/etc/localtime:ro", "/etc/timezone:/etc/timezone:ro", "kolla_logs:/var/log/kolla/", "fluentd_data:/var/lib/fluentd/data/", "/var/log/journal:/var/log/journal:ro"]}}, "msg": "AnsibleUndefinedVariable: [{'name': 'swift', 'enabled': \"{{ enable_swift | bool and (inventory_hostname in groups['swift-proxy-server'] or inventory_hostname in groups['swift-account-server'] or inventory_hostname in groups['swift-container-server'] or inventory_hostname in groups['swift-object-server']) }}\", 'facility': '{{ syslog_swift_facility }}', 'logdir': 'swift', 'logfile': 'swift_latest', 'output_tag': True, 'output_time': True}, {'name': 'haproxy', 'enabled': \"{{ enable_haproxy | bool and inventory_hostname in groups['loadbalancer'] }}\", 'facility': '{{ syslog_haproxy_facility }}', 'logdir': 'haproxy', 'logfile': 'haproxy_latest'}, {'name': 'glance_tls_proxy', 'enabled': \"{{ glance_enable_tls_backend | bool and inventory_hostname in groups['glance-api'] }}\", 'facility': '{{ syslog_glance_tls_proxy_facility }}', 'logdir': 'glance-tls-proxy', 'logfile': 'glance-tls-proxy'}, {'name': 'neutron_tls_proxy', 'enabled': \"{{ neutron_enable_tls_backend | bool and inventory_hostname in groups['neutron-server'] }}\", 'facility': '{{ syslog_neutron_tls_proxy_facility }}', 'logdir': 'neutron-tls-proxy', 'logfile': 'neutron-tls-proxy'}]: 'enable_swift' is undefined"} 2025-09-27 00:43:57.739996 | orchestrator | An exception occurred during 
task execution. To see the full traceback, use -vvv. The error was: ansible.errors.AnsibleUndefinedVariable: [{'name': 'swift', 'enabled': "{{ enable_swift | bool and (inventory_hostname in groups['swift-proxy-server'] or inventory_hostname in groups['swift-account-server'] or inventory_hostname in groups['swift-container-server'] or inventory_hostname in groups['swift-object-server']) }}", 'facility': '{{ syslog_swift_facility }}', 'logdir': 'swift', 'logfile': 'swift_latest', 'output_tag': True, 'output_time': True}, {'name': 'haproxy', 'enabled': "{{ enable_haproxy | bool and inventory_hostname in groups['loadbalancer'] }}", 'facility': '{{ syslog_haproxy_facility }}', 'logdir': 'haproxy', 'logfile': 'haproxy_latest'}, {'name': 'glance_tls_proxy', 'enabled': "{{ glance_enable_tls_backend | bool and inventory_hostname in groups['glance-api'] }}", 'facility': '{{ syslog_glance_tls_proxy_facility }}', 'logdir': 'glance-tls-proxy', 'logfile': 'glance-tls-proxy'}, {'name': 'neutron_tls_proxy', 'enabled': "{{ neutron_enable_tls_backend | bool and inventory_hostname in groups['neutron-server'] }}", 'facility': '{{ syslog_neutron_tls_proxy_facility }}', 'logdir': 'neutron-tls-proxy', 'logfile': 'neutron-tls-proxy'}]: 'enable_swift' is undefined 2025-09-27 00:43:57.740005 | orchestrator | failed: [testbed-node-1] (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/fluentd:2024.2', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}}) => {"ansible_loop_var": "item", "changed": false, "item": {"key": "fluentd", "value": {"container_name": "fluentd", "dimensions": {}, "enabled": true, "environment": {"KOLLA_CONFIG_STRATEGY": "COPY_ALWAYS"}, "group": "fluentd", "image": "registry.osism.tech/kolla/fluentd:2024.2", "volumes": ["/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro", "/etc/localtime:/etc/localtime:ro", "/etc/timezone:/etc/timezone:ro", "kolla_logs:/var/log/kolla/", "fluentd_data:/var/lib/fluentd/data/", "/var/log/journal:/var/log/journal:ro"]}}, "msg": "AnsibleUndefinedVariable: [{'name': 'swift', 'enabled': \"{{ enable_swift | bool and (inventory_hostname in groups['swift-proxy-server'] or inventory_hostname in groups['swift-account-server'] or inventory_hostname in groups['swift-container-server'] or inventory_hostname in groups['swift-object-server']) }}\", 'facility': '{{ syslog_swift_facility }}', 'logdir': 'swift', 'logfile': 'swift_latest', 'output_tag': True, 'output_time': True}, {'name': 'haproxy', 'enabled': \"{{ enable_haproxy | bool and inventory_hostname in groups['loadbalancer'] }}\", 'facility': '{{ syslog_haproxy_facility }}', 'logdir': 'haproxy', 'logfile': 'haproxy_latest'}, {'name': 'glance_tls_proxy', 'enabled': \"{{ glance_enable_tls_backend | bool and inventory_hostname in groups['glance-api'] }}\", 'facility': '{{ syslog_glance_tls_proxy_facility }}', 'logdir': 'glance-tls-proxy', 'logfile': 'glance-tls-proxy'}, {'name': 'neutron_tls_proxy', 'enabled': \"{{ neutron_enable_tls_backend | bool and inventory_hostname in groups['neutron-server'] }}\", 'facility': '{{ syslog_neutron_tls_proxy_facility }}', 'logdir': 'neutron-tls-proxy', 'logfile': 'neutron-tls-proxy'}]: 'enable_swift' is undefined"} 2025-09-27 
00:43:57.740025 | orchestrator | An exception occurred during task execution. To see the full traceback, use -vvv. The error was: ansible.errors.AnsibleUndefinedVariable: [{'name': 'swift', 'enabled': "{{ enable_swift | bool and (inventory_hostname in groups['swift-proxy-server'] or inventory_hostname in groups['swift-account-server'] or inventory_hostname in groups['swift-container-server'] or inventory_hostname in groups['swift-object-server']) }}", 'facility': '{{ syslog_swift_facility }}', 'logdir': 'swift', 'logfile': 'swift_latest', 'output_tag': True, 'output_time': True}, {'name': 'haproxy', 'enabled': "{{ enable_haproxy | bool and inventory_hostname in groups['loadbalancer'] }}", 'facility': '{{ syslog_haproxy_facility }}', 'logdir': 'haproxy', 'logfile': 'haproxy_latest'}, {'name': 'glance_tls_proxy', 'enabled': "{{ glance_enable_tls_backend | bool and inventory_hostname in groups['glance-api'] }}", 'facility': '{{ syslog_glance_tls_proxy_facility }}', 'logdir': 'glance-tls-proxy', 'logfile': 'glance-tls-proxy'}, {'name': 'neutron_tls_proxy', 'enabled': "{{ neutron_enable_tls_backend | bool and inventory_hostname in groups['neutron-server'] }}", 'facility': '{{ syslog_neutron_tls_proxy_facility }}', 'logdir': 'neutron-tls-proxy', 'logfile': 'neutron-tls-proxy'}]: 'enable_swift' is undefined 2025-09-27 00:43:57.740039 | orchestrator | failed: [testbed-node-2] (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/fluentd:2024.2', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}}) => {"ansible_loop_var": "item", "changed": false, "item": {"key": "fluentd", "value": {"container_name": "fluentd", "dimensions": {}, "enabled": true, "environment": {"KOLLA_CONFIG_STRATEGY": "COPY_ALWAYS"}, "group": "fluentd", "image": "registry.osism.tech/kolla/fluentd:2024.2", "volumes": ["/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro", "/etc/localtime:/etc/localtime:ro", "/etc/timezone:/etc/timezone:ro", "kolla_logs:/var/log/kolla/", "fluentd_data:/var/lib/fluentd/data/", "/var/log/journal:/var/log/journal:ro"]}}, "msg": "AnsibleUndefinedVariable: [{'name': 'swift', 'enabled': \"{{ enable_swift | bool and (inventory_hostname in groups['swift-proxy-server'] or inventory_hostname in groups['swift-account-server'] or inventory_hostname in groups['swift-container-server'] or inventory_hostname in groups['swift-object-server']) }}\", 'facility': '{{ syslog_swift_facility }}', 'logdir': 'swift', 'logfile': 'swift_latest', 'output_tag': True, 'output_time': True}, {'name': 'haproxy', 'enabled': \"{{ enable_haproxy | bool and inventory_hostname in groups['loadbalancer'] }}\", 'facility': '{{ syslog_haproxy_facility }}', 'logdir': 'haproxy', 'logfile': 'haproxy_latest'}, {'name': 'glance_tls_proxy', 'enabled': \"{{ glance_enable_tls_backend | bool and inventory_hostname in groups['glance-api'] }}\", 'facility': '{{ syslog_glance_tls_proxy_facility }}', 'logdir': 'glance-tls-proxy', 'logfile': 'glance-tls-proxy'}, {'name': 'neutron_tls_proxy', 'enabled': \"{{ neutron_enable_tls_backend | bool and inventory_hostname in groups['neutron-server'] }}\", 'facility': '{{ syslog_neutron_tls_proxy_facility }}', 'logdir': 'neutron-tls-proxy', 'logfile': 
'neutron-tls-proxy'}]: 'enable_swift' is undefined"} 2025-09-27 00:43:57.740058 | orchestrator | An exception occurred during task execution. To see the full traceback, use -vvv. The error was: ansible.errors.AnsibleUndefinedVariable: [{'name': 'swift', 'enabled': "{{ enable_swift | bool and (inventory_hostname in groups['swift-proxy-server'] or inventory_hostname in groups['swift-account-server'] or inventory_hostname in groups['swift-container-server'] or inventory_hostname in groups['swift-object-server']) }}", 'facility': '{{ syslog_swift_facility }}', 'logdir': 'swift', 'logfile': 'swift_latest', 'output_tag': True, 'output_time': True}, {'name': 'haproxy', 'enabled': "{{ enable_haproxy | bool and inventory_hostname in groups['loadbalancer'] }}", 'facility': '{{ syslog_haproxy_facility }}', 'logdir': 'haproxy', 'logfile': 'haproxy_latest'}, {'name': 'glance_tls_proxy', 'enabled': "{{ glance_enable_tls_backend | bool and inventory_hostname in groups['glance-api'] }}", 'facility': '{{ syslog_glance_tls_proxy_facility }}', 'logdir': 'glance-tls-proxy', 'logfile': 'glance-tls-proxy'}, {'name': 'neutron_tls_proxy', 'enabled': "{{ neutron_enable_tls_backend | bool and inventory_hostname in groups['neutron-server'] }}", 'facility': '{{ syslog_neutron_tls_proxy_facility }}', 'logdir': 'neutron-tls-proxy', 'logfile': 'neutron-tls-proxy'}]: 'enable_swift' is undefined 2025-09-27 00:43:57.740075 | orchestrator | failed: [testbed-manager] (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/fluentd:2024.2', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}}) => {"ansible_loop_var": "item", "changed": false, "item": {"key": "fluentd", "value": {"container_name": "fluentd", "dimensions": {}, "enabled": true, "environment": {"KOLLA_CONFIG_STRATEGY": "COPY_ALWAYS"}, "group": "fluentd", "image": "registry.osism.tech/kolla/fluentd:2024.2", "volumes": ["/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro", "/etc/localtime:/etc/localtime:ro", "/etc/timezone:/etc/timezone:ro", "kolla_logs:/var/log/kolla/", "fluentd_data:/var/lib/fluentd/data/", "/var/log/journal:/var/log/journal:ro"]}}, "msg": "AnsibleUndefinedVariable: [{'name': 'swift', 'enabled': \"{{ enable_swift | bool and (inventory_hostname in groups['swift-proxy-server'] or inventory_hostname in groups['swift-account-server'] or inventory_hostname in groups['swift-container-server'] or inventory_hostname in groups['swift-object-server']) }}\", 'facility': '{{ syslog_swift_facility }}', 'logdir': 'swift', 'logfile': 'swift_latest', 'output_tag': True, 'output_time': True}, {'name': 'haproxy', 'enabled': \"{{ enable_haproxy | bool and inventory_hostname in groups['loadbalancer'] }}\", 'facility': '{{ syslog_haproxy_facility }}', 'logdir': 'haproxy', 'logfile': 'haproxy_latest'}, {'name': 'glance_tls_proxy', 'enabled': \"{{ glance_enable_tls_backend | bool and inventory_hostname in groups['glance-api'] }}\", 'facility': '{{ syslog_glance_tls_proxy_facility }}', 'logdir': 'glance-tls-proxy', 'logfile': 'glance-tls-proxy'}, {'name': 'neutron_tls_proxy', 'enabled': \"{{ neutron_enable_tls_backend | bool and inventory_hostname in groups['neutron-server'] }}\", 'facility': '{{ 
syslog_neutron_tls_proxy_facility }}', 'logdir': 'neutron-tls-proxy', 'logfile': 'neutron-tls-proxy'}]: 'enable_swift' is undefined"} 2025-09-27 00:43:57.740093 | orchestrator | An exception occurred during task execution. To see the full traceback, use -vvv. The error was: ansible.errors.AnsibleUndefinedVariable: [{'name': 'swift', 'enabled': "{{ enable_swift | bool and (inventory_hostname in groups['swift-proxy-server'] or inventory_hostname in groups['swift-account-server'] or inventory_hostname in groups['swift-container-server'] or inventory_hostname in groups['swift-object-server']) }}", 'facility': '{{ syslog_swift_facility }}', 'logdir': 'swift', 'logfile': 'swift_latest', 'output_tag': True, 'output_time': True}, {'name': 'haproxy', 'enabled': "{{ enable_haproxy | bool and inventory_hostname in groups['loadbalancer'] }}", 'facility': '{{ syslog_haproxy_facility }}', 'logdir': 'haproxy', 'logfile': 'haproxy_latest'}, {'name': 'glance_tls_proxy', 'enabled': "{{ glance_enable_tls_backend | bool and inventory_hostname in groups['glance-api'] }}", 'facility': '{{ syslog_glance_tls_proxy_facility }}', 'logdir': 'glance-tls-proxy', 'logfile': 'glance-tls-proxy'}, {'name': 'neutron_tls_proxy', 'enabled': "{{ neutron_enable_tls_backend | bool and inventory_hostname in groups['neutron-server'] }}", 'facility': '{{ syslog_neutron_tls_proxy_facility }}', 'logdir': 'neutron-tls-proxy', 'logfile': 'neutron-tls-proxy'}]: 'enable_swift' is undefined 2025-09-27 00:43:57.740107 | orchestrator | failed: [testbed-node-3] (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/fluentd:2024.2', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}}) => {"ansible_loop_var": "item", "changed": false, "item": {"key": "fluentd", "value": {"container_name": "fluentd", "dimensions": {}, "enabled": true, "environment": {"KOLLA_CONFIG_STRATEGY": "COPY_ALWAYS"}, "group": "fluentd", "image": "registry.osism.tech/kolla/fluentd:2024.2", "volumes": ["/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro", "/etc/localtime:/etc/localtime:ro", "/etc/timezone:/etc/timezone:ro", "kolla_logs:/var/log/kolla/", "fluentd_data:/var/lib/fluentd/data/", "/var/log/journal:/var/log/journal:ro"]}}, "msg": "AnsibleUndefinedVariable: [{'name': 'swift', 'enabled': \"{{ enable_swift | bool and (inventory_hostname in groups['swift-proxy-server'] or inventory_hostname in groups['swift-account-server'] or inventory_hostname in groups['swift-container-server'] or inventory_hostname in groups['swift-object-server']) }}\", 'facility': '{{ syslog_swift_facility }}', 'logdir': 'swift', 'logfile': 'swift_latest', 'output_tag': True, 'output_time': True}, {'name': 'haproxy', 'enabled': \"{{ enable_haproxy | bool and inventory_hostname in groups['loadbalancer'] }}\", 'facility': '{{ syslog_haproxy_facility }}', 'logdir': 'haproxy', 'logfile': 'haproxy_latest'}, {'name': 'glance_tls_proxy', 'enabled': \"{{ glance_enable_tls_backend | bool and inventory_hostname in groups['glance-api'] }}\", 'facility': '{{ syslog_glance_tls_proxy_facility }}', 'logdir': 'glance-tls-proxy', 'logfile': 'glance-tls-proxy'}, {'name': 'neutron_tls_proxy', 'enabled': \"{{ neutron_enable_tls_backend | bool and inventory_hostname 
in groups['neutron-server'] }}\", 'facility': '{{ syslog_neutron_tls_proxy_facility }}', 'logdir': 'neutron-tls-proxy', 'logfile': 'neutron-tls-proxy'}]: 'enable_swift' is undefined"} 2025-09-27 00:43:57.740120 | orchestrator | changed: [testbed-node-0] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/kolla-toolbox:2024.2', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-09-27 00:43:57.740129 | orchestrator | changed: [testbed-node-1] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/kolla-toolbox:2024.2', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-09-27 00:43:57.740142 | orchestrator | changed: [testbed-node-2] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/kolla-toolbox:2024.2', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-09-27 00:43:57.740151 | orchestrator | An exception occurred during task execution. To see the full traceback, use -vvv. 
The error was: ansible.errors.AnsibleUndefinedVariable: [{'name': 'swift', 'enabled': "{{ enable_swift | bool and (inventory_hostname in groups['swift-proxy-server'] or inventory_hostname in groups['swift-account-server'] or inventory_hostname in groups['swift-container-server'] or inventory_hostname in groups['swift-object-server']) }}", 'facility': '{{ syslog_swift_facility }}', 'logdir': 'swift', 'logfile': 'swift_latest', 'output_tag': True, 'output_time': True}, {'name': 'haproxy', 'enabled': "{{ enable_haproxy | bool and inventory_hostname in groups['loadbalancer'] }}", 'facility': '{{ syslog_haproxy_facility }}', 'logdir': 'haproxy', 'logfile': 'haproxy_latest'}, {'name': 'glance_tls_proxy', 'enabled': "{{ glance_enable_tls_backend | bool and inventory_hostname in groups['glance-api'] }}", 'facility': '{{ syslog_glance_tls_proxy_facility }}', 'logdir': 'glance-tls-proxy', 'logfile': 'glance-tls-proxy'}, {'name': 'neutron_tls_proxy', 'enabled': "{{ neutron_enable_tls_backend | bool and inventory_hostname in groups['neutron-server'] }}", 'facility': '{{ syslog_neutron_tls_proxy_facility }}', 'logdir': 'neutron-tls-proxy', 'logfile': 'neutron-tls-proxy'}]: 'enable_swift' is undefined 2025-09-27 00:43:57.740165 | orchestrator | failed: [testbed-node-4] (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/fluentd:2024.2', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}}) => {"ansible_loop_var": "item", "changed": false, "item": {"key": "fluentd", "value": {"container_name": "fluentd", "dimensions": {}, "enabled": true, "environment": {"KOLLA_CONFIG_STRATEGY": "COPY_ALWAYS"}, "group": "fluentd", "image": "registry.osism.tech/kolla/fluentd:2024.2", "volumes": ["/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro", "/etc/localtime:/etc/localtime:ro", "/etc/timezone:/etc/timezone:ro", "kolla_logs:/var/log/kolla/", "fluentd_data:/var/lib/fluentd/data/", "/var/log/journal:/var/log/journal:ro"]}}, "msg": "AnsibleUndefinedVariable: [{'name': 'swift', 'enabled': \"{{ enable_swift | bool and (inventory_hostname in groups['swift-proxy-server'] or inventory_hostname in groups['swift-account-server'] or inventory_hostname in groups['swift-container-server'] or inventory_hostname in groups['swift-object-server']) }}\", 'facility': '{{ syslog_swift_facility }}', 'logdir': 'swift', 'logfile': 'swift_latest', 'output_tag': True, 'output_time': True}, {'name': 'haproxy', 'enabled': \"{{ enable_haproxy | bool and inventory_hostname in groups['loadbalancer'] }}\", 'facility': '{{ syslog_haproxy_facility }}', 'logdir': 'haproxy', 'logfile': 'haproxy_latest'}, {'name': 'glance_tls_proxy', 'enabled': \"{{ glance_enable_tls_backend | bool and inventory_hostname in groups['glance-api'] }}\", 'facility': '{{ syslog_glance_tls_proxy_facility }}', 'logdir': 'glance-tls-proxy', 'logfile': 'glance-tls-proxy'}, {'name': 'neutron_tls_proxy', 'enabled': \"{{ neutron_enable_tls_backend | bool and inventory_hostname in groups['neutron-server'] }}\", 'facility': '{{ syslog_neutron_tls_proxy_facility }}', 'logdir': 'neutron-tls-proxy', 'logfile': 'neutron-tls-proxy'}]: 'enable_swift' is undefined"} 2025-09-27 00:43:57.740184 | orchestrator | An exception occurred during 
task execution. To see the full traceback, use -vvv. The error was: ansible.errors.AnsibleUndefinedVariable: [{'name': 'swift', 'enabled': "{{ enable_swift | bool and (inventory_hostname in groups['swift-proxy-server'] or inventory_hostname in groups['swift-account-server'] or inventory_hostname in groups['swift-container-server'] or inventory_hostname in groups['swift-object-server']) }}", 'facility': '{{ syslog_swift_facility }}', 'logdir': 'swift', 'logfile': 'swift_latest', 'output_tag': True, 'output_time': True}, {'name': 'haproxy', 'enabled': "{{ enable_haproxy | bool and inventory_hostname in groups['loadbalancer'] }}", 'facility': '{{ syslog_haproxy_facility }}', 'logdir': 'haproxy', 'logfile': 'haproxy_latest'}, {'name': 'glance_tls_proxy', 'enabled': "{{ glance_enable_tls_backend | bool and inventory_hostname in groups['glance-api'] }}", 'facility': '{{ syslog_glance_tls_proxy_facility }}', 'logdir': 'glance-tls-proxy', 'logfile': 'glance-tls-proxy'}, {'name': 'neutron_tls_proxy', 'enabled': "{{ neutron_enable_tls_backend | bool and inventory_hostname in groups['neutron-server'] }}", 'facility': '{{ syslog_neutron_tls_proxy_facility }}', 'logdir': 'neutron-tls-proxy', 'logfile': 'neutron-tls-proxy'}]: 'enable_swift' is undefined 2025-09-27 00:43:57.740199 | orchestrator | failed: [testbed-node-5] (item={'key': 'fluentd', 'value': {'container_name': 'fluentd', 'group': 'fluentd', 'enabled': True, 'image': 'registry.osism.tech/kolla/fluentd:2024.2', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'fluentd_data:/var/lib/fluentd/data/', '/var/log/journal:/var/log/journal:ro'], 'dimensions': {}}}) => {"ansible_loop_var": "item", "changed": false, "item": {"key": "fluentd", "value": {"container_name": "fluentd", "dimensions": {}, "enabled": true, "environment": {"KOLLA_CONFIG_STRATEGY": "COPY_ALWAYS"}, "group": "fluentd", "image": "registry.osism.tech/kolla/fluentd:2024.2", "volumes": ["/etc/kolla/fluentd/:/var/lib/kolla/config_files/:ro", "/etc/localtime:/etc/localtime:ro", "/etc/timezone:/etc/timezone:ro", "kolla_logs:/var/log/kolla/", "fluentd_data:/var/lib/fluentd/data/", "/var/log/journal:/var/log/journal:ro"]}}, "msg": "AnsibleUndefinedVariable: [{'name': 'swift', 'enabled': \"{{ enable_swift | bool and (inventory_hostname in groups['swift-proxy-server'] or inventory_hostname in groups['swift-account-server'] or inventory_hostname in groups['swift-container-server'] or inventory_hostname in groups['swift-object-server']) }}\", 'facility': '{{ syslog_swift_facility }}', 'logdir': 'swift', 'logfile': 'swift_latest', 'output_tag': True, 'output_time': True}, {'name': 'haproxy', 'enabled': \"{{ enable_haproxy | bool and inventory_hostname in groups['loadbalancer'] }}\", 'facility': '{{ syslog_haproxy_facility }}', 'logdir': 'haproxy', 'logfile': 'haproxy_latest'}, {'name': 'glance_tls_proxy', 'enabled': \"{{ glance_enable_tls_backend | bool and inventory_hostname in groups['glance-api'] }}\", 'facility': '{{ syslog_glance_tls_proxy_facility }}', 'logdir': 'glance-tls-proxy', 'logfile': 'glance-tls-proxy'}, {'name': 'neutron_tls_proxy', 'enabled': \"{{ neutron_enable_tls_backend | bool and inventory_hostname in groups['neutron-server'] }}\", 'facility': '{{ syslog_neutron_tls_proxy_facility }}', 'logdir': 'neutron-tls-proxy', 'logfile': 'neutron-tls-proxy'}]: 'enable_swift' is undefined"} 2025-09-27 
00:43:57.740256 | orchestrator | changed: [testbed-manager] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/kolla-toolbox:2024.2', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-09-27 00:43:57.740271 | orchestrator | changed: [testbed-node-3] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/kolla-toolbox:2024.2', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-09-27 00:43:57.740280 | orchestrator | changed: [testbed-node-0] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/cron:2024.2', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-09-27 00:43:57.740288 | orchestrator | changed: [testbed-node-1] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/cron:2024.2', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-09-27 00:43:57.740301 | orchestrator | changed: [testbed-node-2] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/cron:2024.2', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-09-27 00:43:57.740309 | orchestrator | changed: [testbed-node-4] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/kolla-toolbox:2024.2', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-09-27 00:43:57.740323 | orchestrator | changed: [testbed-manager] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/cron:2024.2', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 
'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-09-27 00:43:57.740331 | orchestrator | changed: [testbed-node-3] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/cron:2024.2', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-09-27 00:43:57.740339 | orchestrator | changed: [testbed-node-5] => (item={'key': 'kolla-toolbox', 'value': {'container_name': 'kolla_toolbox', 'group': 'kolla-toolbox', 'enabled': True, 'image': 'registry.osism.tech/kolla/kolla-toolbox:2024.2', 'environment': {'ANSIBLE_NOCOLOR': '1', 'ANSIBLE_LIBRARY': '/usr/share/ansible', 'REQUESTS_CA_BUNDLE': '/etc/ssl/certs/ca-certificates.crt'}, 'privileged': True, 'volumes': ['/etc/kolla/kolla-toolbox/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run/:/run/:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-09-27 00:43:57.740348 | orchestrator | changed: [testbed-node-4] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/cron:2024.2', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-09-27 00:43:57.740361 | orchestrator | changed: [testbed-node-5] => (item={'key': 'cron', 'value': {'container_name': 'cron', 'group': 'cron', 'enabled': True, 'image': 'registry.osism.tech/kolla/cron:2024.2', 'environment': {'KOLLA_LOGROTATE_SCHEDULE': 'daily'}, 'volumes': ['/etc/kolla/cron/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-09-27 00:43:57.740369 | orchestrator | 2025-09-27 00:43:57.740376 | orchestrator | PLAY RECAP ********************************************************************* 2025-09-27 00:43:57.740383 | orchestrator | testbed-manager : ok=4  changed=2  unreachable=0 failed=1  skipped=4  rescued=0 ignored=0 2025-09-27 00:43:57.740390 | orchestrator | testbed-node-0 : ok=4  changed=2  unreachable=0 failed=1  skipped=4  rescued=0 ignored=0 2025-09-27 00:43:57.740397 | orchestrator | testbed-node-1 : ok=4  changed=2  unreachable=0 failed=1  skipped=4  rescued=0 ignored=0 2025-09-27 00:43:57.740404 | orchestrator | testbed-node-2 : ok=4  changed=2  unreachable=0 failed=1  skipped=4  rescued=0 ignored=0 2025-09-27 00:43:57.740410 | orchestrator | testbed-node-3 : ok=4  changed=2  unreachable=0 failed=1  skipped=4  rescued=0 ignored=0 2025-09-27 00:43:57.740420 | orchestrator | testbed-node-4 : ok=4  changed=2  unreachable=0 failed=1  skipped=4  rescued=0 ignored=0 2025-09-27 00:43:57.740432 | orchestrator | testbed-node-5 : ok=4  changed=2  unreachable=0 failed=1  skipped=4  rescued=0 ignored=0 2025-09-27 00:43:57.740439 | orchestrator | 2025-09-27 00:43:57.740445 | orchestrator | 2025-09-27 00:43:57.740452 | orchestrator | TASKS RECAP ******************************************************************** 2025-09-27 00:43:57.740459 | orchestrator | 
Saturday 27 September 2025 00:43:56 +0000 (0:00:05.289) 0:00:24.507 **** 2025-09-27 00:43:57.740466 | orchestrator | =============================================================================== 2025-09-27 00:43:57.740472 | orchestrator | service-cert-copy : common | Copying over extra CA certificates --------- 5.64s 2025-09-27 00:43:57.740479 | orchestrator | common : Copying over config.json files for services -------------------- 5.29s 2025-09-27 00:43:57.740486 | orchestrator | common : Ensuring config directories exist ------------------------------ 4.00s 2025-09-27 00:43:57.740493 | orchestrator | common : Restart systemd-tmpfiles --------------------------------------- 2.12s 2025-09-27 00:43:57.740499 | orchestrator | service-cert-copy : common | Copying over backend internal TLS key ------ 1.85s 2025-09-27 00:43:57.740506 | orchestrator | service-cert-copy : common | Copying over backend internal TLS certificate --- 1.79s 2025-09-27 00:43:57.740513 | orchestrator | common : Copying over /run subdirectories conf -------------------------- 1.30s 2025-09-27 00:43:57.740519 | orchestrator | common : include_tasks -------------------------------------------------- 1.19s 2025-09-27 00:43:57.740526 | orchestrator | common : include_tasks -------------------------------------------------- 1.09s 2025-09-27 00:43:57.740533 | orchestrator | 2025-09-27 00:43:57 | INFO  | Task 6738d185-a1b4-49ce-965d-88758fa6634a is in state STARTED 2025-09-27 00:43:57.740540 | orchestrator | 2025-09-27 00:43:57 | INFO  | Task 625d002c-1b01-4e2f-8d5c-d5ab32c60c73 is in state STARTED 2025-09-27 00:43:57.740547 | orchestrator | 2025-09-27 00:43:57 | INFO  | Task 1d877c1d-aea7-49a7-ac1d-fd8e8254af58 is in state STARTED 2025-09-27 00:43:57.740553 | orchestrator | 2025-09-27 00:43:57 | INFO  | Task 16ceff3c-2063-4187-b9f0-c442aab0be5b is in state STARTED 2025-09-27 00:43:57.740560 | orchestrator | 2025-09-27 00:43:57 | INFO  | Wait 1 second(s) until the next check 2025-09-27 00:44:00.980024 | orchestrator | 2025-09-27 00:44:00 | INFO  | Task ce7cb7a8-3653-4424-bfed-7e1a3f9348b1 is in state STARTED 2025-09-27 00:44:00.991880 | orchestrator | 2025-09-27 00:44:00 | INFO  | Task a6a10e8b-f7b1-4f45-a12b-29ae21a864dc is in state STARTED 2025-09-27 00:44:00.991932 | orchestrator | 2025-09-27 00:44:00 | INFO  | Task 8fa36388-2203-4150-93bd-9847f61bae4b is in state STARTED 2025-09-27 00:44:00.991944 | orchestrator | 2025-09-27 00:44:00 | INFO  | Task 6738d185-a1b4-49ce-965d-88758fa6634a is in state STARTED 2025-09-27 00:44:00.991955 | orchestrator | 2025-09-27 00:44:00 | INFO  | Task 625d002c-1b01-4e2f-8d5c-d5ab32c60c73 is in state STARTED 2025-09-27 00:44:00.991966 | orchestrator | 2025-09-27 00:44:00 | INFO  | Task 5b160b40-3f10-4cee-93e8-cb0339f4d343 is in state STARTED 2025-09-27 00:44:00.991977 | orchestrator | 2025-09-27 00:44:00 | INFO  | Task 4bac4415-47ae-418f-938a-24c52f1a62d3 is in state STARTED 2025-09-27 00:44:00.991988 | orchestrator | 2025-09-27 00:44:00 | INFO  | Task 1d877c1d-aea7-49a7-ac1d-fd8e8254af58 is in state STARTED 2025-09-27 00:44:00.991998 | orchestrator | 2025-09-27 00:44:00 | INFO  | Task 16ceff3c-2063-4187-b9f0-c442aab0be5b is in state STARTED 2025-09-27 00:44:00.992009 | orchestrator | 2025-09-27 00:44:00 | INFO  | Task 0dbfe98d-624d-4af0-b069-0ef2b2ddc90e is in state STARTED 2025-09-27 00:44:00.992020 | orchestrator | 2025-09-27 00:44:00 | INFO  | Wait 1 second(s) until the next check 2025-09-27 00:44:04.032961 | orchestrator | 2025-09-27 00:44:04 | INFO  | Task 
ce7cb7a8-3653-4424-bfed-7e1a3f9348b1 is in state STARTED 2025-09-27 00:44:04.034492 | orchestrator | 2025-09-27 00:44:04 | INFO  | Task a6a10e8b-f7b1-4f45-a12b-29ae21a864dc is in state STARTED 2025-09-27 00:44:04.034541 | orchestrator | 2025-09-27 00:44:04 | INFO  | Task 8fa36388-2203-4150-93bd-9847f61bae4b is in state STARTED 2025-09-27 00:44:04.036081 | orchestrator | 2025-09-27 00:44:04 | INFO  | Task 6738d185-a1b4-49ce-965d-88758fa6634a is in state STARTED 2025-09-27 00:44:04.036195 | orchestrator | 2025-09-27 00:44:04 | INFO  | Task 625d002c-1b01-4e2f-8d5c-d5ab32c60c73 is in state STARTED 2025-09-27 00:44:04.037503 | orchestrator | 2025-09-27 00:44:04 | INFO  | Task 5b160b40-3f10-4cee-93e8-cb0339f4d343 is in state STARTED 2025-09-27 00:44:04.037545 | orchestrator | 2025-09-27 00:44:04 | INFO  | Task 4bac4415-47ae-418f-938a-24c52f1a62d3 is in state STARTED 2025-09-27 00:44:04.038564 | orchestrator | 2025-09-27 00:44:04 | INFO  | Task 1d877c1d-aea7-49a7-ac1d-fd8e8254af58 is in state STARTED 2025-09-27 00:44:04.040430 | orchestrator | 2025-09-27 00:44:04 | INFO  | Task 16ceff3c-2063-4187-b9f0-c442aab0be5b is in state STARTED 2025-09-27 00:44:04.042830 | orchestrator | 2025-09-27 00:44:04 | INFO  | Task 0dbfe98d-624d-4af0-b069-0ef2b2ddc90e is in state STARTED 2025-09-27 00:44:04.042854 | orchestrator | 2025-09-27 00:44:04 | INFO  | Wait 1 second(s) until the next check 2025-09-27 00:44:07.095638 | orchestrator | 2025-09-27 00:44:07 | INFO  | Task ce7cb7a8-3653-4424-bfed-7e1a3f9348b1 is in state STARTED 2025-09-27 00:44:07.096076 | orchestrator | 2025-09-27 00:44:07 | INFO  | Task a6a10e8b-f7b1-4f45-a12b-29ae21a864dc is in state STARTED 2025-09-27 00:44:07.097495 | orchestrator | 2025-09-27 00:44:07 | INFO  | Task 8fa36388-2203-4150-93bd-9847f61bae4b is in state STARTED 2025-09-27 00:44:07.099998 | orchestrator | 2025-09-27 00:44:07 | INFO  | Task 6738d185-a1b4-49ce-965d-88758fa6634a is in state STARTED 2025-09-27 00:44:07.106978 | orchestrator | 2025-09-27 00:44:07 | INFO  | Task 625d002c-1b01-4e2f-8d5c-d5ab32c60c73 is in state STARTED 2025-09-27 00:44:07.113423 | orchestrator | 2025-09-27 00:44:07 | INFO  | Task 5b160b40-3f10-4cee-93e8-cb0339f4d343 is in state STARTED 2025-09-27 00:44:07.116666 | orchestrator | 2025-09-27 00:44:07 | INFO  | Task 4bac4415-47ae-418f-938a-24c52f1a62d3 is in state STARTED 2025-09-27 00:44:07.116693 | orchestrator | 2025-09-27 00:44:07 | INFO  | Task 1d877c1d-aea7-49a7-ac1d-fd8e8254af58 is in state STARTED 2025-09-27 00:44:07.118201 | orchestrator | 2025-09-27 00:44:07 | INFO  | Task 16ceff3c-2063-4187-b9f0-c442aab0be5b is in state STARTED 2025-09-27 00:44:07.119840 | orchestrator | 2025-09-27 00:44:07 | INFO  | Task 0dbfe98d-624d-4af0-b069-0ef2b2ddc90e is in state STARTED 2025-09-27 00:44:07.119861 | orchestrator | 2025-09-27 00:44:07 | INFO  | Wait 1 second(s) until the next check 2025-09-27 00:44:10.159236 | orchestrator | 2025-09-27 00:44:10 | INFO  | Task ce7cb7a8-3653-4424-bfed-7e1a3f9348b1 is in state STARTED 2025-09-27 00:44:10.159452 | orchestrator | 2025-09-27 00:44:10 | INFO  | Task a6a10e8b-f7b1-4f45-a12b-29ae21a864dc is in state STARTED 2025-09-27 00:44:10.163827 | orchestrator | 2025-09-27 00:44:10 | INFO  | Task 8fa36388-2203-4150-93bd-9847f61bae4b is in state STARTED 2025-09-27 00:44:10.163858 | orchestrator | 2025-09-27 00:44:10 | INFO  | Task 6738d185-a1b4-49ce-965d-88758fa6634a is in state STARTED 2025-09-27 00:44:10.167919 | orchestrator | 2025-09-27 00:44:10 | INFO  | Task 625d002c-1b01-4e2f-8d5c-d5ab32c60c73 is in state STARTED 
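The failures logged above at 00:43:57 all come from the "common : Copying over config.json files for services" task: rendering the fluentd item evaluates a list of per-service log inputs whose enabled expressions reference enable_swift, which is undefined on these hosts, so every host fails on that item while the kolla-toolbox and cron items render fine. That flag is normally supplied by kolla-ansible's own group_vars defaults; when a custom configuration layer is used it has to be declared there as well. A minimal, hypothetical sketch of such a declaration, assuming the flags live in the environment's kolla globals file (for example environments/kolla/configuration.yml in an OSISM testbed or /etc/kolla/globals.yml in plain kolla-ansible; the exact path depends on the setup):

    # Hypothetical globals override; the templates pipe these flags through
    # "| bool", so native booleans or the traditional "yes"/"no" strings both work.
    enable_swift: false                 # the flag the traceback reports as undefined
    # The other flags referenced in the same fluentd input list are usually
    # already defined; they are shown here only for context.
    enable_haproxy: true
    glance_enable_tls_backend: false
    neutron_enable_tls_backend: false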
2025-09-27 00:44:10.167946 | orchestrator | 2025-09-27 00:44:10 | INFO  | Task 5b160b40-3f10-4cee-93e8-cb0339f4d343 is in state STARTED 2025-09-27 00:44:10.168202 | orchestrator | 2025-09-27 00:44:10 | INFO  | Task 4bac4415-47ae-418f-938a-24c52f1a62d3 is in state STARTED 2025-09-27 00:44:10.176789 | orchestrator | 2025-09-27 00:44:10 | INFO  | Task 1d877c1d-aea7-49a7-ac1d-fd8e8254af58 is in state STARTED 2025-09-27 00:44:10.176813 | orchestrator | 2025-09-27 00:44:10 | INFO  | Task 16ceff3c-2063-4187-b9f0-c442aab0be5b is in state STARTED 2025-09-27 00:44:10.178432 | orchestrator | 2025-09-27 00:44:10 | INFO  | Task 0dbfe98d-624d-4af0-b069-0ef2b2ddc90e is in state STARTED 2025-09-27 00:44:10.178453 | orchestrator | 2025-09-27 00:44:10 | INFO  | Wait 1 second(s) until the next check 2025-09-27 00:44:13.252137 | orchestrator | 2025-09-27 00:44:13 | INFO  | Task ce7cb7a8-3653-4424-bfed-7e1a3f9348b1 is in state STARTED 2025-09-27 00:44:13.252275 | orchestrator | 2025-09-27 00:44:13 | INFO  | Task a6a10e8b-f7b1-4f45-a12b-29ae21a864dc is in state STARTED 2025-09-27 00:44:13.252291 | orchestrator | 2025-09-27 00:44:13 | INFO  | Task 8fa36388-2203-4150-93bd-9847f61bae4b is in state STARTED 2025-09-27 00:44:13.252303 | orchestrator | 2025-09-27 00:44:13 | INFO  | Task 6738d185-a1b4-49ce-965d-88758fa6634a is in state STARTED 2025-09-27 00:44:13.252314 | orchestrator | 2025-09-27 00:44:13 | INFO  | Task 625d002c-1b01-4e2f-8d5c-d5ab32c60c73 is in state STARTED 2025-09-27 00:44:13.252324 | orchestrator | 2025-09-27 00:44:13 | INFO  | Task 5b160b40-3f10-4cee-93e8-cb0339f4d343 is in state STARTED 2025-09-27 00:44:13.252335 | orchestrator | 2025-09-27 00:44:13 | INFO  | Task 4bac4415-47ae-418f-938a-24c52f1a62d3 is in state STARTED 2025-09-27 00:44:13.252346 | orchestrator | 2025-09-27 00:44:13 | INFO  | Task 1d877c1d-aea7-49a7-ac1d-fd8e8254af58 is in state STARTED 2025-09-27 00:44:13.252356 | orchestrator | 2025-09-27 00:44:13 | INFO  | Task 16ceff3c-2063-4187-b9f0-c442aab0be5b is in state STARTED 2025-09-27 00:44:13.252387 | orchestrator | 2025-09-27 00:44:13 | INFO  | Task 0dbfe98d-624d-4af0-b069-0ef2b2ddc90e is in state STARTED 2025-09-27 00:44:13.252399 | orchestrator | 2025-09-27 00:44:13 | INFO  | Wait 1 second(s) until the next check 2025-09-27 00:44:16.570280 | orchestrator | 2025-09-27 00:44:16 | INFO  | Task ce7cb7a8-3653-4424-bfed-7e1a3f9348b1 is in state STARTED 2025-09-27 00:44:16.570375 | orchestrator | 2025-09-27 00:44:16 | INFO  | Task a6a10e8b-f7b1-4f45-a12b-29ae21a864dc is in state STARTED 2025-09-27 00:44:16.570389 | orchestrator | 2025-09-27 00:44:16 | INFO  | Task 8fa36388-2203-4150-93bd-9847f61bae4b is in state SUCCESS 2025-09-27 00:44:16.570400 | orchestrator | 2025-09-27 00:44:16 | INFO  | Task 6738d185-a1b4-49ce-965d-88758fa6634a is in state STARTED 2025-09-27 00:44:16.570411 | orchestrator | 2025-09-27 00:44:16 | INFO  | Task 625d002c-1b01-4e2f-8d5c-d5ab32c60c73 is in state STARTED 2025-09-27 00:44:16.570422 | orchestrator | 2025-09-27 00:44:16 | INFO  | Task 5b160b40-3f10-4cee-93e8-cb0339f4d343 is in state STARTED 2025-09-27 00:44:16.570433 | orchestrator | 2025-09-27 00:44:16 | INFO  | Task 4bac4415-47ae-418f-938a-24c52f1a62d3 is in state STARTED 2025-09-27 00:44:16.570444 | orchestrator | 2025-09-27 00:44:16 | INFO  | Task 1d877c1d-aea7-49a7-ac1d-fd8e8254af58 is in state STARTED 2025-09-27 00:44:16.570454 | orchestrator | 2025-09-27 00:44:16 | INFO  | Task 16ceff3c-2063-4187-b9f0-c442aab0be5b is in state STARTED 2025-09-27 00:44:16.570465 | orchestrator | 2025-09-27 00:44:16 | 
INFO  | Task 0dbfe98d-624d-4af0-b069-0ef2b2ddc90e is in state STARTED 2025-09-27 00:44:16.570476 | orchestrator | 2025-09-27 00:44:16 | INFO  | Wait 1 second(s) until the next check 2025-09-27 00:44:19.670003 | orchestrator | 2025-09-27 00:44:19 | INFO  | Task ce7cb7a8-3653-4424-bfed-7e1a3f9348b1 is in state STARTED 2025-09-27 00:44:19.670137 | orchestrator | 2025-09-27 00:44:19 | INFO  | Task a6a10e8b-f7b1-4f45-a12b-29ae21a864dc is in state STARTED 2025-09-27 00:44:19.670179 | orchestrator | 2025-09-27 00:44:19 | INFO  | Task 6738d185-a1b4-49ce-965d-88758fa6634a is in state STARTED 2025-09-27 00:44:19.670192 | orchestrator | 2025-09-27 00:44:19 | INFO  | Task 625d002c-1b01-4e2f-8d5c-d5ab32c60c73 is in state STARTED 2025-09-27 00:44:19.670203 | orchestrator | 2025-09-27 00:44:19 | INFO  | Task 5b160b40-3f10-4cee-93e8-cb0339f4d343 is in state STARTED 2025-09-27 00:44:19.670266 | orchestrator | 2025-09-27 00:44:19 | INFO  | Task 4bac4415-47ae-418f-938a-24c52f1a62d3 is in state STARTED 2025-09-27 00:44:19.670277 | orchestrator | 2025-09-27 00:44:19 | INFO  | Task 1d877c1d-aea7-49a7-ac1d-fd8e8254af58 is in state STARTED 2025-09-27 00:44:19.670288 | orchestrator | 2025-09-27 00:44:19 | INFO  | Task 16ceff3c-2063-4187-b9f0-c442aab0be5b is in state STARTED 2025-09-27 00:44:19.670299 | orchestrator | 2025-09-27 00:44:19 | INFO  | Task 0dbfe98d-624d-4af0-b069-0ef2b2ddc90e is in state STARTED 2025-09-27 00:44:19.670310 | orchestrator | 2025-09-27 00:44:19 | INFO  | Wait 1 second(s) until the next check 2025-09-27 00:44:22.735828 | orchestrator | 2025-09-27 00:44:22 | INFO  | Task ce7cb7a8-3653-4424-bfed-7e1a3f9348b1 is in state STARTED 2025-09-27 00:44:22.735925 | orchestrator | 2025-09-27 00:44:22 | INFO  | Task a6a10e8b-f7b1-4f45-a12b-29ae21a864dc is in state STARTED 2025-09-27 00:44:22.735940 | orchestrator | 2025-09-27 00:44:22 | INFO  | Task 6738d185-a1b4-49ce-965d-88758fa6634a is in state STARTED 2025-09-27 00:44:22.735953 | orchestrator | 2025-09-27 00:44:22 | INFO  | Task 625d002c-1b01-4e2f-8d5c-d5ab32c60c73 is in state STARTED 2025-09-27 00:44:22.735964 | orchestrator | 2025-09-27 00:44:22 | INFO  | Task 5b160b40-3f10-4cee-93e8-cb0339f4d343 is in state STARTED 2025-09-27 00:44:22.735975 | orchestrator | 2025-09-27 00:44:22 | INFO  | Task 4bac4415-47ae-418f-938a-24c52f1a62d3 is in state STARTED 2025-09-27 00:44:22.735986 | orchestrator | 2025-09-27 00:44:22 | INFO  | Task 1d877c1d-aea7-49a7-ac1d-fd8e8254af58 is in state STARTED 2025-09-27 00:44:22.735997 | orchestrator | 2025-09-27 00:44:22 | INFO  | Task 16ceff3c-2063-4187-b9f0-c442aab0be5b is in state STARTED 2025-09-27 00:44:22.736007 | orchestrator | 2025-09-27 00:44:22 | INFO  | Task 0dbfe98d-624d-4af0-b069-0ef2b2ddc90e is in state STARTED 2025-09-27 00:44:22.736018 | orchestrator | 2025-09-27 00:44:22 | INFO  | Wait 1 second(s) until the next check 2025-09-27 00:44:25.787581 | orchestrator | 2025-09-27 00:44:25 | INFO  | Task ce7cb7a8-3653-4424-bfed-7e1a3f9348b1 is in state STARTED 2025-09-27 00:44:25.788669 | orchestrator | 2025-09-27 00:44:25 | INFO  | Task a6a10e8b-f7b1-4f45-a12b-29ae21a864dc is in state SUCCESS 2025-09-27 00:44:25.789931 | orchestrator | 2025-09-27 00:44:25 | INFO  | Task 6738d185-a1b4-49ce-965d-88758fa6634a is in state STARTED 2025-09-27 00:44:25.803661 | orchestrator | 2025-09-27 00:44:25 | INFO  | Task 625d002c-1b01-4e2f-8d5c-d5ab32c60c73 is in state STARTED 2025-09-27 00:44:25.803690 | orchestrator | 2025-09-27 00:44:25 | INFO  | Task 5b160b40-3f10-4cee-93e8-cb0339f4d343 is in state STARTED 2025-09-27 
00:44:25.803701 | orchestrator | 2025-09-27 00:44:25 | INFO  | Task 4bac4415-47ae-418f-938a-24c52f1a62d3 is in state STARTED 2025-09-27 00:44:25.803713 | orchestrator | 2025-09-27 00:44:25 | INFO  | Task 1d877c1d-aea7-49a7-ac1d-fd8e8254af58 is in state STARTED 2025-09-27 00:44:25.803723 | orchestrator | 2025-09-27 00:44:25 | INFO  | Task 16ceff3c-2063-4187-b9f0-c442aab0be5b is in state STARTED 2025-09-27 00:44:25.803734 | orchestrator | 2025-09-27 00:44:25 | INFO  | Task 0dbfe98d-624d-4af0-b069-0ef2b2ddc90e is in state STARTED 2025-09-27 00:44:25.803769 | orchestrator | 2025-09-27 00:44:25 | INFO  | Wait 1 second(s) until the next check 2025-09-27 00:44:28.860106 | orchestrator | 2025-09-27 00:44:28 | INFO  | Task ce7cb7a8-3653-4424-bfed-7e1a3f9348b1 is in state STARTED 2025-09-27 00:44:28.860201 | orchestrator | 2025-09-27 00:44:28 | INFO  | Task 6738d185-a1b4-49ce-965d-88758fa6634a is in state STARTED 2025-09-27 00:44:28.860266 | orchestrator | 2025-09-27 00:44:28 | INFO  | Task 625d002c-1b01-4e2f-8d5c-d5ab32c60c73 is in state STARTED 2025-09-27 00:44:28.860278 | orchestrator | 2025-09-27 00:44:28 | INFO  | Task 5b160b40-3f10-4cee-93e8-cb0339f4d343 is in state STARTED 2025-09-27 00:44:28.860289 | orchestrator | 2025-09-27 00:44:28 | INFO  | Task 4bac4415-47ae-418f-938a-24c52f1a62d3 is in state STARTED 2025-09-27 00:44:28.860300 | orchestrator | 2025-09-27 00:44:28 | INFO  | Task 1d877c1d-aea7-49a7-ac1d-fd8e8254af58 is in state STARTED 2025-09-27 00:44:28.860311 | orchestrator | 2025-09-27 00:44:28 | INFO  | Task 16ceff3c-2063-4187-b9f0-c442aab0be5b is in state STARTED 2025-09-27 00:44:28.860322 | orchestrator | 2025-09-27 00:44:28 | INFO  | Task 0dbfe98d-624d-4af0-b069-0ef2b2ddc90e is in state STARTED 2025-09-27 00:44:28.860333 | orchestrator | 2025-09-27 00:44:28 | INFO  | Wait 1 second(s) until the next check 2025-09-27 00:44:32.174367 | orchestrator | 2025-09-27 00:44:32 | INFO  | Task ce7cb7a8-3653-4424-bfed-7e1a3f9348b1 is in state STARTED 2025-09-27 00:44:32.174457 | orchestrator | 2025-09-27 00:44:32 | INFO  | Task 6738d185-a1b4-49ce-965d-88758fa6634a is in state STARTED 2025-09-27 00:44:32.174470 | orchestrator | 2025-09-27 00:44:32 | INFO  | Task 625d002c-1b01-4e2f-8d5c-d5ab32c60c73 is in state STARTED 2025-09-27 00:44:32.174481 | orchestrator | 2025-09-27 00:44:32 | INFO  | Task 5b160b40-3f10-4cee-93e8-cb0339f4d343 is in state STARTED 2025-09-27 00:44:32.174491 | orchestrator | 2025-09-27 00:44:32 | INFO  | Task 4bac4415-47ae-418f-938a-24c52f1a62d3 is in state STARTED 2025-09-27 00:44:32.174502 | orchestrator | 2025-09-27 00:44:32 | INFO  | Task 1d877c1d-aea7-49a7-ac1d-fd8e8254af58 is in state STARTED 2025-09-27 00:44:32.174513 | orchestrator | 2025-09-27 00:44:32 | INFO  | Task 16ceff3c-2063-4187-b9f0-c442aab0be5b is in state STARTED 2025-09-27 00:44:32.174523 | orchestrator | 2025-09-27 00:44:32 | INFO  | Task 0dbfe98d-624d-4af0-b069-0ef2b2ddc90e is in state STARTED 2025-09-27 00:44:32.174534 | orchestrator | 2025-09-27 00:44:32 | INFO  | Wait 1 second(s) until the next check 2025-09-27 00:44:35.365536 | orchestrator | 2025-09-27 00:44:35 | INFO  | Task ce7cb7a8-3653-4424-bfed-7e1a3f9348b1 is in state STARTED 2025-09-27 00:44:35.365791 | orchestrator | 2025-09-27 00:44:35 | INFO  | Task 6738d185-a1b4-49ce-965d-88758fa6634a is in state STARTED 2025-09-27 00:44:35.365826 | orchestrator | 2025-09-27 00:44:35 | INFO  | Task 625d002c-1b01-4e2f-8d5c-d5ab32c60c73 is in state STARTED 2025-09-27 00:44:35.366349 | orchestrator | 2025-09-27 00:44:35 | INFO  | Task 
5b160b40-3f10-4cee-93e8-cb0339f4d343 is in state STARTED 2025-09-27 00:44:35.366786 | orchestrator | 2025-09-27 00:44:35 | INFO  | Task 4bac4415-47ae-418f-938a-24c52f1a62d3 is in state STARTED 2025-09-27 00:44:35.367328 | orchestrator | 2025-09-27 00:44:35 | INFO  | Task 1d877c1d-aea7-49a7-ac1d-fd8e8254af58 is in state STARTED 2025-09-27 00:44:35.367802 | orchestrator | 2025-09-27 00:44:35 | INFO  | Task 16ceff3c-2063-4187-b9f0-c442aab0be5b is in state STARTED 2025-09-27 00:44:35.368465 | orchestrator | 2025-09-27 00:44:35 | INFO  | Task 0dbfe98d-624d-4af0-b069-0ef2b2ddc90e is in state STARTED 2025-09-27 00:44:35.368492 | orchestrator | 2025-09-27 00:44:35 | INFO  | Wait 1 second(s) until the next check 2025-09-27 00:44:38.476100 | orchestrator | 2025-09-27 00:44:38 | INFO  | Task ce7cb7a8-3653-4424-bfed-7e1a3f9348b1 is in state STARTED 2025-09-27 00:44:38.478265 | orchestrator | 2025-09-27 00:44:38 | INFO  | Task 6738d185-a1b4-49ce-965d-88758fa6634a is in state STARTED 2025-09-27 00:44:38.479397 | orchestrator | 2025-09-27 00:44:38 | INFO  | Task 625d002c-1b01-4e2f-8d5c-d5ab32c60c73 is in state STARTED 2025-09-27 00:44:38.480566 | orchestrator | 2025-09-27 00:44:38 | INFO  | Task 5b160b40-3f10-4cee-93e8-cb0339f4d343 is in state STARTED 2025-09-27 00:44:38.481580 | orchestrator | 2025-09-27 00:44:38 | INFO  | Task 4bac4415-47ae-418f-938a-24c52f1a62d3 is in state STARTED 2025-09-27 00:44:38.482729 | orchestrator | 2025-09-27 00:44:38 | INFO  | Task 1d877c1d-aea7-49a7-ac1d-fd8e8254af58 is in state STARTED 2025-09-27 00:44:38.483833 | orchestrator | 2025-09-27 00:44:38 | INFO  | Task 16ceff3c-2063-4187-b9f0-c442aab0be5b is in state STARTED 2025-09-27 00:44:38.484992 | orchestrator | 2025-09-27 00:44:38 | INFO  | Task 0dbfe98d-624d-4af0-b069-0ef2b2ddc90e is in state STARTED 2025-09-27 00:44:38.485018 | orchestrator | 2025-09-27 00:44:38 | INFO  | Wait 1 second(s) until the next check 2025-09-27 00:44:41.900675 | orchestrator | 2025-09-27 00:44:41 | INFO  | Task ce7cb7a8-3653-4424-bfed-7e1a3f9348b1 is in state STARTED 2025-09-27 00:44:41.900774 | orchestrator | 2025-09-27 00:44:41 | INFO  | Task 6738d185-a1b4-49ce-965d-88758fa6634a is in state STARTED 2025-09-27 00:44:41.903936 | orchestrator | 2025-09-27 00:44:41 | INFO  | Task 625d002c-1b01-4e2f-8d5c-d5ab32c60c73 is in state STARTED 2025-09-27 00:44:41.905459 | orchestrator | 2025-09-27 00:44:41 | INFO  | Task 5b160b40-3f10-4cee-93e8-cb0339f4d343 is in state STARTED 2025-09-27 00:44:41.905522 | orchestrator | 2025-09-27 00:44:41 | INFO  | Task 4bac4415-47ae-418f-938a-24c52f1a62d3 is in state STARTED 2025-09-27 00:44:41.907545 | orchestrator | 2025-09-27 00:44:41 | INFO  | Task 1d877c1d-aea7-49a7-ac1d-fd8e8254af58 is in state STARTED 2025-09-27 00:44:41.908014 | orchestrator | 2025-09-27 00:44:41 | INFO  | Task 16ceff3c-2063-4187-b9f0-c442aab0be5b is in state STARTED 2025-09-27 00:44:41.909422 | orchestrator | 2025-09-27 00:44:41 | INFO  | Task 0dbfe98d-624d-4af0-b069-0ef2b2ddc90e is in state STARTED 2025-09-27 00:44:41.909448 | orchestrator | 2025-09-27 00:44:41 | INFO  | Wait 1 second(s) until the next check 2025-09-27 00:44:44.963739 | orchestrator | 2025-09-27 00:44:44 | INFO  | Task ce7cb7a8-3653-4424-bfed-7e1a3f9348b1 is in state STARTED 2025-09-27 00:44:44.963953 | orchestrator | 2025-09-27 00:44:44 | INFO  | Task 6738d185-a1b4-49ce-965d-88758fa6634a is in state STARTED 2025-09-27 00:44:44.964900 | orchestrator | 2025-09-27 00:44:44 | INFO  | Task 625d002c-1b01-4e2f-8d5c-d5ab32c60c73 is in state STARTED 2025-09-27 00:44:44.965970 | 
orchestrator | 2025-09-27 00:44:44 | INFO  | Task 5b160b40-3f10-4cee-93e8-cb0339f4d343 is in state STARTED 2025-09-27 00:44:44.965996 | orchestrator | 2025-09-27 00:44:44 | INFO  | Task 4bac4415-47ae-418f-938a-24c52f1a62d3 is in state STARTED 2025-09-27 00:44:44.970829 | orchestrator | 2025-09-27 00:44:44 | INFO  | Task 1d877c1d-aea7-49a7-ac1d-fd8e8254af58 is in state STARTED 2025-09-27 00:44:44.970853 | orchestrator | 2025-09-27 00:44:44 | INFO  | Task 16ceff3c-2063-4187-b9f0-c442aab0be5b is in state SUCCESS 2025-09-27 00:44:44.970865 | orchestrator | 2025-09-27 00:44:44 | INFO  | Task 0dbfe98d-624d-4af0-b069-0ef2b2ddc90e is in state STARTED 2025-09-27 00:44:44.971004 | orchestrator | 2025-09-27 00:44:44 | INFO  | Wait 1 second(s) until the next check 2025-09-27 00:44:44.971595 | orchestrator | 2025-09-27 00:44:44.971617 | orchestrator | 2025-09-27 00:44:44.971629 | orchestrator | PLAY [Apply role homer] ******************************************************** 2025-09-27 00:44:44.971668 | orchestrator | 2025-09-27 00:44:44.971680 | orchestrator | TASK [osism.services.homer : Inform about new parameter homer_url_opensearch_dashboards] *** 2025-09-27 00:44:44.971692 | orchestrator | Saturday 27 September 2025 00:43:37 +0000 (0:00:00.459) 0:00:00.460 **** 2025-09-27 00:44:44.971704 | orchestrator | ok: [testbed-manager] => { 2025-09-27 00:44:44.971717 | orchestrator |  "msg": "The support for the homer_url_kibana has been removed. Please use the homer_url_opensearch_dashboards parameter." 2025-09-27 00:44:44.971730 | orchestrator | } 2025-09-27 00:44:44.971742 | orchestrator | 2025-09-27 00:44:44.971753 | orchestrator | TASK [osism.services.homer : Create traefik external network] ****************** 2025-09-27 00:44:44.971764 | orchestrator | Saturday 27 September 2025 00:43:38 +0000 (0:00:00.309) 0:00:00.769 **** 2025-09-27 00:44:44.971775 | orchestrator | ok: [testbed-manager] 2025-09-27 00:44:44.971789 | orchestrator | 2025-09-27 00:44:44.971800 | orchestrator | TASK [osism.services.homer : Create required directories] ********************** 2025-09-27 00:44:44.971812 | orchestrator | Saturday 27 September 2025 00:43:39 +0000 (0:00:01.120) 0:00:01.890 **** 2025-09-27 00:44:44.971823 | orchestrator | changed: [testbed-manager] => (item=/opt/homer/configuration) 2025-09-27 00:44:44.971834 | orchestrator | ok: [testbed-manager] => (item=/opt/homer) 2025-09-27 00:44:44.971845 | orchestrator | 2025-09-27 00:44:44.971857 | orchestrator | TASK [osism.services.homer : Copy config.yml configuration file] *************** 2025-09-27 00:44:44.971868 | orchestrator | Saturday 27 September 2025 00:43:40 +0000 (0:00:01.484) 0:00:03.375 **** 2025-09-27 00:44:44.971879 | orchestrator | changed: [testbed-manager] 2025-09-27 00:44:44.971891 | orchestrator | 2025-09-27 00:44:44.971902 | orchestrator | TASK [osism.services.homer : Copy docker-compose.yml file] ********************* 2025-09-27 00:44:44.971913 | orchestrator | Saturday 27 September 2025 00:43:43 +0000 (0:00:03.259) 0:00:06.634 **** 2025-09-27 00:44:44.971924 | orchestrator | changed: [testbed-manager] 2025-09-27 00:44:44.971936 | orchestrator | 2025-09-27 00:44:44.971947 | orchestrator | TASK [osism.services.homer : Manage homer service] ***************************** 2025-09-27 00:44:44.971976 | orchestrator | Saturday 27 September 2025 00:43:45 +0000 (0:00:01.986) 0:00:08.621 **** 2025-09-27 00:44:44.971987 | orchestrator | FAILED - RETRYING: [testbed-manager]: Manage homer service (10 retries left). 
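The "FAILED - RETRYING: [testbed-manager]: Manage homer service (10 retries left)" line above, and the 24.56s entry for the same task in the recap that follows, reflect the usual Ansible retry/until idiom around bringing a docker compose project up: the task is re-run until the service starts or the retries are exhausted. A rough, hypothetical sketch of that idiom (not the actual osism.services.homer task; the command, delay and changed_when handling are illustrative, with /opt/homer taken from the directories created above):

    # Illustrative retry/until sketch matching the "(10 retries left)" message.
    - name: Manage homer service
      ansible.builtin.command: docker compose --project-directory /opt/homer up -d
      register: result
      retries: 10          # produces the "(n retries left)" messages on failure
      delay: 5             # hypothetical pause between attempts
      until: result.rc == 0
      changed_when: false  # illustrative; the logged task reports ok and a handler restarts the service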
2025-09-27 00:44:44.971999 | orchestrator | ok: [testbed-manager] 2025-09-27 00:44:44.972010 | orchestrator | 2025-09-27 00:44:44.972021 | orchestrator | RUNNING HANDLER [osism.services.homer : Restart homer service] ***************** 2025-09-27 00:44:44.972032 | orchestrator | Saturday 27 September 2025 00:44:10 +0000 (0:00:24.556) 0:00:33.177 **** 2025-09-27 00:44:44.972043 | orchestrator | changed: [testbed-manager] 2025-09-27 00:44:44.972054 | orchestrator | 2025-09-27 00:44:44.972065 | orchestrator | PLAY RECAP ********************************************************************* 2025-09-27 00:44:44.972077 | orchestrator | testbed-manager : ok=7  changed=4  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 2025-09-27 00:44:44.972090 | orchestrator | 2025-09-27 00:44:44.972101 | orchestrator | 2025-09-27 00:44:44.972112 | orchestrator | TASKS RECAP ******************************************************************** 2025-09-27 00:44:44.972123 | orchestrator | Saturday 27 September 2025 00:44:13 +0000 (0:00:02.666) 0:00:35.844 **** 2025-09-27 00:44:44.972134 | orchestrator | =============================================================================== 2025-09-27 00:44:44.972146 | orchestrator | osism.services.homer : Manage homer service ---------------------------- 24.56s 2025-09-27 00:44:44.972157 | orchestrator | osism.services.homer : Copy config.yml configuration file --------------- 3.25s 2025-09-27 00:44:44.972168 | orchestrator | osism.services.homer : Restart homer service ---------------------------- 2.67s 2025-09-27 00:44:44.972179 | orchestrator | osism.services.homer : Copy docker-compose.yml file --------------------- 1.99s 2025-09-27 00:44:44.972190 | orchestrator | osism.services.homer : Create required directories ---------------------- 1.49s 2025-09-27 00:44:44.972202 | orchestrator | osism.services.homer : Create traefik external network ------------------ 1.12s 2025-09-27 00:44:44.972270 | orchestrator | osism.services.homer : Inform about new parameter homer_url_opensearch_dashboards --- 0.31s 2025-09-27 00:44:44.972293 | orchestrator | 2025-09-27 00:44:44.972306 | orchestrator | 2025-09-27 00:44:44.972318 | orchestrator | PLAY [Apply role openstackclient] ********************************************** 2025-09-27 00:44:44.972330 | orchestrator | 2025-09-27 00:44:44.972343 | orchestrator | TASK [osism.services.openstackclient : Include tasks] ************************** 2025-09-27 00:44:44.972355 | orchestrator | Saturday 27 September 2025 00:43:37 +0000 (0:00:00.289) 0:00:00.289 **** 2025-09-27 00:44:44.972369 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/openstackclient/tasks/container-Debian-family.yml for testbed-manager 2025-09-27 00:44:44.972382 | orchestrator | 2025-09-27 00:44:44.972394 | orchestrator | TASK [osism.services.openstackclient : Create required directories] ************ 2025-09-27 00:44:44.972407 | orchestrator | Saturday 27 September 2025 00:43:38 +0000 (0:00:00.353) 0:00:00.643 **** 2025-09-27 00:44:44.972419 | orchestrator | changed: [testbed-manager] => (item=/opt/configuration/environments/openstack) 2025-09-27 00:44:44.972431 | orchestrator | changed: [testbed-manager] => (item=/opt/openstackclient/data) 2025-09-27 00:44:44.972443 | orchestrator | ok: [testbed-manager] => (item=/opt/openstackclient) 2025-09-27 00:44:44.972455 | orchestrator | 2025-09-27 00:44:44.972467 | orchestrator | TASK [osism.services.openstackclient : Copy docker-compose.yml file] *********** 2025-09-27 
00:44:44.972480 | orchestrator | Saturday 27 September 2025 00:43:40 +0000 (0:00:02.652) 0:00:03.295 **** 2025-09-27 00:44:44.972493 | orchestrator | changed: [testbed-manager] 2025-09-27 00:44:44.972505 | orchestrator | 2025-09-27 00:44:44.972518 | orchestrator | TASK [osism.services.openstackclient : Manage openstackclient service] ********* 2025-09-27 00:44:44.972530 | orchestrator | Saturday 27 September 2025 00:43:43 +0000 (0:00:02.598) 0:00:05.893 **** 2025-09-27 00:44:44.972554 | orchestrator | FAILED - RETRYING: [testbed-manager]: Manage openstackclient service (10 retries left). 2025-09-27 00:44:44.972566 | orchestrator | ok: [testbed-manager] 2025-09-27 00:44:44.972577 | orchestrator | 2025-09-27 00:44:44.972587 | orchestrator | TASK [osism.services.openstackclient : Copy openstack wrapper script] ********** 2025-09-27 00:44:44.972598 | orchestrator | Saturday 27 September 2025 00:44:15 +0000 (0:00:32.171) 0:00:38.065 **** 2025-09-27 00:44:44.972609 | orchestrator | changed: [testbed-manager] 2025-09-27 00:44:44.972619 | orchestrator | 2025-09-27 00:44:44.972630 | orchestrator | TASK [osism.services.openstackclient : Remove ospurge wrapper script] ********** 2025-09-27 00:44:44.972641 | orchestrator | Saturday 27 September 2025 00:44:16 +0000 (0:00:00.811) 0:00:38.877 **** 2025-09-27 00:44:44.972651 | orchestrator | ok: [testbed-manager] 2025-09-27 00:44:44.972662 | orchestrator | 2025-09-27 00:44:44.972673 | orchestrator | RUNNING HANDLER [osism.services.openstackclient : Restart openstackclient service] *** 2025-09-27 00:44:44.972683 | orchestrator | Saturday 27 September 2025 00:44:17 +0000 (0:00:00.881) 0:00:39.758 **** 2025-09-27 00:44:44.972694 | orchestrator | changed: [testbed-manager] 2025-09-27 00:44:44.972705 | orchestrator | 2025-09-27 00:44:44.972715 | orchestrator | RUNNING HANDLER [osism.services.openstackclient : Ensure that all containers are up] *** 2025-09-27 00:44:44.972726 | orchestrator | Saturday 27 September 2025 00:44:20 +0000 (0:00:02.909) 0:00:42.668 **** 2025-09-27 00:44:44.972737 | orchestrator | changed: [testbed-manager] 2025-09-27 00:44:44.972747 | orchestrator | 2025-09-27 00:44:44.972758 | orchestrator | RUNNING HANDLER [osism.services.openstackclient : Wait for an healthy service] *** 2025-09-27 00:44:44.972775 | orchestrator | Saturday 27 September 2025 00:44:21 +0000 (0:00:01.201) 0:00:43.869 **** 2025-09-27 00:44:44.972786 | orchestrator | changed: [testbed-manager] 2025-09-27 00:44:44.972797 | orchestrator | 2025-09-27 00:44:44.972807 | orchestrator | RUNNING HANDLER [osism.services.openstackclient : Copy bash completion script] *** 2025-09-27 00:44:44.972818 | orchestrator | Saturday 27 September 2025 00:44:22 +0000 (0:00:00.834) 0:00:44.703 **** 2025-09-27 00:44:44.972828 | orchestrator | ok: [testbed-manager] 2025-09-27 00:44:44.972839 | orchestrator | 2025-09-27 00:44:44.972850 | orchestrator | PLAY RECAP ********************************************************************* 2025-09-27 00:44:44.972861 | orchestrator | testbed-manager : ok=10  changed=6  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 2025-09-27 00:44:44.972880 | orchestrator | 2025-09-27 00:44:44.972891 | orchestrator | 2025-09-27 00:44:44.972901 | orchestrator | TASKS RECAP ******************************************************************** 2025-09-27 00:44:44.972912 | orchestrator | Saturday 27 September 2025 00:44:22 +0000 (0:00:00.705) 0:00:45.408 **** 2025-09-27 00:44:44.972923 | orchestrator | 
=============================================================================== 2025-09-27 00:44:44.972933 | orchestrator | osism.services.openstackclient : Manage openstackclient service -------- 32.17s 2025-09-27 00:44:44.972944 | orchestrator | osism.services.openstackclient : Restart openstackclient service -------- 2.91s 2025-09-27 00:44:44.972955 | orchestrator | osism.services.openstackclient : Create required directories ------------ 2.65s 2025-09-27 00:44:44.972965 | orchestrator | osism.services.openstackclient : Copy docker-compose.yml file ----------- 2.60s 2025-09-27 00:44:44.972976 | orchestrator | osism.services.openstackclient : Ensure that all containers are up ------ 1.20s 2025-09-27 00:44:44.972987 | orchestrator | osism.services.openstackclient : Remove ospurge wrapper script ---------- 0.88s 2025-09-27 00:44:44.972997 | orchestrator | osism.services.openstackclient : Wait for an healthy service ------------ 0.83s 2025-09-27 00:44:44.973008 | orchestrator | osism.services.openstackclient : Copy openstack wrapper script ---------- 0.81s 2025-09-27 00:44:44.973019 | orchestrator | osism.services.openstackclient : Copy bash completion script ------------ 0.71s 2025-09-27 00:44:44.973029 | orchestrator | osism.services.openstackclient : Include tasks -------------------------- 0.35s 2025-09-27 00:44:44.973040 | orchestrator | 2025-09-27 00:44:44.973050 | orchestrator | 2025-09-27 00:44:44.973061 | orchestrator | PLAY [Group hosts based on configuration] ************************************** 2025-09-27 00:44:44.973072 | orchestrator | 2025-09-27 00:44:44.973082 | orchestrator | TASK [Group hosts based on enabled services] *********************************** 2025-09-27 00:44:44.973093 | orchestrator | Saturday 27 September 2025 00:43:38 +0000 (0:00:00.375) 0:00:00.375 **** 2025-09-27 00:44:44.973104 | orchestrator | changed: [testbed-manager] => (item=enable_netdata_True) 2025-09-27 00:44:44.973114 | orchestrator | changed: [testbed-node-0] => (item=enable_netdata_True) 2025-09-27 00:44:44.973125 | orchestrator | changed: [testbed-node-1] => (item=enable_netdata_True) 2025-09-27 00:44:44.973135 | orchestrator | changed: [testbed-node-2] => (item=enable_netdata_True) 2025-09-27 00:44:44.973146 | orchestrator | changed: [testbed-node-3] => (item=enable_netdata_True) 2025-09-27 00:44:44.973156 | orchestrator | changed: [testbed-node-4] => (item=enable_netdata_True) 2025-09-27 00:44:44.973167 | orchestrator | changed: [testbed-node-5] => (item=enable_netdata_True) 2025-09-27 00:44:44.973178 | orchestrator | 2025-09-27 00:44:44.973188 | orchestrator | PLAY [Apply role netdata] ****************************************************** 2025-09-27 00:44:44.973199 | orchestrator | 2025-09-27 00:44:44.973227 | orchestrator | TASK [osism.services.netdata : Include distribution specific install tasks] **** 2025-09-27 00:44:44.973238 | orchestrator | Saturday 27 September 2025 00:43:39 +0000 (0:00:01.012) 0:00:01.387 **** 2025-09-27 00:44:44.973251 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/netdata/tasks/install-Debian-family.yml for testbed-manager, testbed-node-0, testbed-node-1, testbed-node-2, testbed-node-4, testbed-node-3, testbed-node-5 2025-09-27 00:44:44.973264 | orchestrator | 2025-09-27 00:44:44.973275 | orchestrator | TASK [osism.services.netdata : Remove old architecture-dependent repository] *** 2025-09-27 00:44:44.973286 | orchestrator | Saturday 27 September 2025 00:43:41 +0000 (0:00:01.343) 0:00:02.731 **** 2025-09-27 
00:44:44.973297 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:44:44.973308 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:44:44.973334 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:44:44.973344 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:44:44.973355 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:44:44.973373 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:44:44.973384 | orchestrator | ok: [testbed-manager] 2025-09-27 00:44:44.973395 | orchestrator | 2025-09-27 00:44:44.973413 | orchestrator | TASK [osism.services.netdata : Install apt-transport-https package] ************ 2025-09-27 00:44:44.973424 | orchestrator | Saturday 27 September 2025 00:43:43 +0000 (0:00:01.713) 0:00:04.445 **** 2025-09-27 00:44:44.973435 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:44:44.973445 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:44:44.973456 | orchestrator | ok: [testbed-manager] 2025-09-27 00:44:44.973467 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:44:44.973477 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:44:44.973488 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:44:44.973498 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:44:44.973509 | orchestrator | 2025-09-27 00:44:44.973520 | orchestrator | TASK [osism.services.netdata : Add repository gpg key] ************************* 2025-09-27 00:44:44.973530 | orchestrator | Saturday 27 September 2025 00:43:47 +0000 (0:00:03.987) 0:00:08.432 **** 2025-09-27 00:44:44.973541 | orchestrator | changed: [testbed-manager] 2025-09-27 00:44:44.973552 | orchestrator | changed: [testbed-node-2] 2025-09-27 00:44:44.973562 | orchestrator | changed: [testbed-node-1] 2025-09-27 00:44:44.973573 | orchestrator | changed: [testbed-node-0] 2025-09-27 00:44:44.973584 | orchestrator | changed: [testbed-node-3] 2025-09-27 00:44:44.973594 | orchestrator | changed: [testbed-node-4] 2025-09-27 00:44:44.973605 | orchestrator | changed: [testbed-node-5] 2025-09-27 00:44:44.973615 | orchestrator | 2025-09-27 00:44:44.973626 | orchestrator | TASK [osism.services.netdata : Add repository] ********************************* 2025-09-27 00:44:44.973642 | orchestrator | Saturday 27 September 2025 00:43:48 +0000 (0:00:01.643) 0:00:10.075 **** 2025-09-27 00:44:44.973653 | orchestrator | changed: [testbed-node-0] 2025-09-27 00:44:44.973663 | orchestrator | changed: [testbed-node-1] 2025-09-27 00:44:44.973674 | orchestrator | changed: [testbed-node-2] 2025-09-27 00:44:44.973684 | orchestrator | changed: [testbed-node-3] 2025-09-27 00:44:44.973695 | orchestrator | changed: [testbed-node-5] 2025-09-27 00:44:44.973705 | orchestrator | changed: [testbed-node-4] 2025-09-27 00:44:44.973716 | orchestrator | changed: [testbed-manager] 2025-09-27 00:44:44.973727 | orchestrator | 2025-09-27 00:44:44.973737 | orchestrator | TASK [osism.services.netdata : Install package netdata] ************************ 2025-09-27 00:44:44.973748 | orchestrator | Saturday 27 September 2025 00:44:00 +0000 (0:00:11.740) 0:00:21.815 **** 2025-09-27 00:44:44.973759 | orchestrator | changed: [testbed-node-5] 2025-09-27 00:44:44.973769 | orchestrator | changed: [testbed-node-1] 2025-09-27 00:44:44.973780 | orchestrator | changed: [testbed-node-3] 2025-09-27 00:44:44.973791 | orchestrator | changed: [testbed-node-4] 2025-09-27 00:44:44.973801 | orchestrator | changed: [testbed-node-0] 2025-09-27 00:44:44.973812 | orchestrator | changed: [testbed-node-2] 2025-09-27 00:44:44.973823 | orchestrator | changed: [testbed-manager] 2025-09-27 00:44:44.973833 | 
orchestrator | 2025-09-27 00:44:44.973844 | orchestrator | TASK [osism.services.netdata : Include config tasks] *************************** 2025-09-27 00:44:44.973855 | orchestrator | Saturday 27 September 2025 00:44:23 +0000 (0:00:23.411) 0:00:45.227 **** 2025-09-27 00:44:44.973866 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/netdata/tasks/config.yml for testbed-manager, testbed-node-0, testbed-node-1, testbed-node-2, testbed-node-3, testbed-node-4, testbed-node-5 2025-09-27 00:44:44.973878 | orchestrator | 2025-09-27 00:44:44.973889 | orchestrator | TASK [osism.services.netdata : Copy configuration files] *********************** 2025-09-27 00:44:44.973900 | orchestrator | Saturday 27 September 2025 00:44:25 +0000 (0:00:01.209) 0:00:46.437 **** 2025-09-27 00:44:44.973910 | orchestrator | changed: [testbed-node-0] => (item=netdata.conf) 2025-09-27 00:44:44.973921 | orchestrator | changed: [testbed-manager] => (item=netdata.conf) 2025-09-27 00:44:44.973932 | orchestrator | changed: [testbed-node-1] => (item=netdata.conf) 2025-09-27 00:44:44.973943 | orchestrator | changed: [testbed-node-4] => (item=netdata.conf) 2025-09-27 00:44:44.973954 | orchestrator | changed: [testbed-node-2] => (item=netdata.conf) 2025-09-27 00:44:44.973964 | orchestrator | changed: [testbed-node-5] => (item=netdata.conf) 2025-09-27 00:44:44.973975 | orchestrator | changed: [testbed-node-3] => (item=netdata.conf) 2025-09-27 00:44:44.973992 | orchestrator | changed: [testbed-node-0] => (item=stream.conf) 2025-09-27 00:44:44.974002 | orchestrator | changed: [testbed-node-4] => (item=stream.conf) 2025-09-27 00:44:44.974013 | orchestrator | changed: [testbed-node-1] => (item=stream.conf) 2025-09-27 00:44:44.974070 | orchestrator | changed: [testbed-node-2] => (item=stream.conf) 2025-09-27 00:44:44.974081 | orchestrator | changed: [testbed-manager] => (item=stream.conf) 2025-09-27 00:44:44.974092 | orchestrator | changed: [testbed-node-5] => (item=stream.conf) 2025-09-27 00:44:44.974103 | orchestrator | changed: [testbed-node-3] => (item=stream.conf) 2025-09-27 00:44:44.974113 | orchestrator | 2025-09-27 00:44:44.974124 | orchestrator | TASK [osism.services.netdata : Retrieve /etc/netdata/.opt-out-from-anonymous-statistics status] *** 2025-09-27 00:44:44.974135 | orchestrator | Saturday 27 September 2025 00:44:29 +0000 (0:00:04.427) 0:00:50.864 **** 2025-09-27 00:44:44.974146 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:44:44.974157 | orchestrator | ok: [testbed-manager] 2025-09-27 00:44:44.974167 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:44:44.974178 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:44:44.974189 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:44:44.974199 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:44:44.974229 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:44:44.974240 | orchestrator | 2025-09-27 00:44:44.974250 | orchestrator | TASK [osism.services.netdata : Opt out from anonymous statistics] ************** 2025-09-27 00:44:44.974261 | orchestrator | Saturday 27 September 2025 00:44:30 +0000 (0:00:00.987) 0:00:51.851 **** 2025-09-27 00:44:44.974272 | orchestrator | changed: [testbed-node-1] 2025-09-27 00:44:44.974282 | orchestrator | changed: [testbed-node-0] 2025-09-27 00:44:44.974293 | orchestrator | changed: [testbed-manager] 2025-09-27 00:44:44.974303 | orchestrator | changed: [testbed-node-4] 2025-09-27 00:44:44.974314 | orchestrator | changed: [testbed-node-5] 2025-09-27 00:44:44.974324 | orchestrator | 
changed: [testbed-node-3] 2025-09-27 00:44:44.974335 | orchestrator | changed: [testbed-node-2] 2025-09-27 00:44:44.974345 | orchestrator | 2025-09-27 00:44:44.974356 | orchestrator | TASK [osism.services.netdata : Add netdata user to docker group] *************** 2025-09-27 00:44:44.974374 | orchestrator | Saturday 27 September 2025 00:44:31 +0000 (0:00:01.518) 0:00:53.369 **** 2025-09-27 00:44:44.974385 | orchestrator | ok: [testbed-manager] 2025-09-27 00:44:44.974395 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:44:44.974406 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:44:44.974417 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:44:44.974427 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:44:44.974438 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:44:44.974449 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:44:44.974459 | orchestrator | 2025-09-27 00:44:44.974470 | orchestrator | TASK [osism.services.netdata : Manage service netdata] ************************* 2025-09-27 00:44:44.974481 | orchestrator | Saturday 27 September 2025 00:44:33 +0000 (0:00:01.639) 0:00:55.009 **** 2025-09-27 00:44:44.974492 | orchestrator | ok: [testbed-manager] 2025-09-27 00:44:44.974502 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:44:44.974513 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:44:44.974523 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:44:44.974534 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:44:44.974544 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:44:44.974555 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:44:44.974565 | orchestrator | 2025-09-27 00:44:44.974576 | orchestrator | TASK [osism.services.netdata : Include host type specific tasks] *************** 2025-09-27 00:44:44.974587 | orchestrator | Saturday 27 September 2025 00:44:35 +0000 (0:00:02.212) 0:00:57.221 **** 2025-09-27 00:44:44.974597 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/netdata/tasks/server.yml for testbed-manager 2025-09-27 00:44:44.974615 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/netdata/tasks/client.yml for testbed-node-0, testbed-node-1, testbed-node-2, testbed-node-3, testbed-node-4, testbed-node-5 2025-09-27 00:44:44.974634 | orchestrator | 2025-09-27 00:44:44.974645 | orchestrator | TASK [osism.services.netdata : Set sysctl vm.max_map_count parameter] ********** 2025-09-27 00:44:44.974656 | orchestrator | Saturday 27 September 2025 00:44:37 +0000 (0:00:01.334) 0:00:58.555 **** 2025-09-27 00:44:44.974666 | orchestrator | changed: [testbed-manager] 2025-09-27 00:44:44.974677 | orchestrator | 2025-09-27 00:44:44.974688 | orchestrator | RUNNING HANDLER [osism.services.netdata : Restart service netdata] ************* 2025-09-27 00:44:44.974698 | orchestrator | Saturday 27 September 2025 00:44:38 +0000 (0:00:01.620) 0:01:00.176 **** 2025-09-27 00:44:44.974709 | orchestrator | changed: [testbed-node-2] 2025-09-27 00:44:44.974720 | orchestrator | changed: [testbed-node-0] 2025-09-27 00:44:44.974730 | orchestrator | changed: [testbed-node-3] 2025-09-27 00:44:44.974741 | orchestrator | changed: [testbed-manager] 2025-09-27 00:44:44.974751 | orchestrator | changed: [testbed-node-1] 2025-09-27 00:44:44.974762 | orchestrator | changed: [testbed-node-5] 2025-09-27 00:44:44.974773 | orchestrator | changed: [testbed-node-4] 2025-09-27 00:44:44.974783 | orchestrator | 2025-09-27 00:44:44.974794 | orchestrator | PLAY RECAP 
********************************************************************* 2025-09-27 00:44:44.974805 | orchestrator | testbed-manager : ok=16  changed=8  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 2025-09-27 00:44:44.974816 | orchestrator | testbed-node-0 : ok=15  changed=7  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 2025-09-27 00:44:44.974827 | orchestrator | testbed-node-1 : ok=15  changed=7  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 2025-09-27 00:44:44.974838 | orchestrator | testbed-node-2 : ok=15  changed=7  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 2025-09-27 00:44:44.974848 | orchestrator | testbed-node-3 : ok=15  changed=7  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 2025-09-27 00:44:44.974859 | orchestrator | testbed-node-4 : ok=15  changed=7  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 2025-09-27 00:44:44.974870 | orchestrator | testbed-node-5 : ok=15  changed=7  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 2025-09-27 00:44:44.974881 | orchestrator | 2025-09-27 00:44:44.974891 | orchestrator | 2025-09-27 00:44:44.974902 | orchestrator | TASKS RECAP ******************************************************************** 2025-09-27 00:44:44.974913 | orchestrator | Saturday 27 September 2025 00:44:42 +0000 (0:00:03.490) 0:01:03.667 **** 2025-09-27 00:44:44.974923 | orchestrator | =============================================================================== 2025-09-27 00:44:44.974934 | orchestrator | osism.services.netdata : Install package netdata ----------------------- 23.41s 2025-09-27 00:44:44.974944 | orchestrator | osism.services.netdata : Add repository -------------------------------- 11.74s 2025-09-27 00:44:44.974955 | orchestrator | osism.services.netdata : Copy configuration files ----------------------- 4.43s 2025-09-27 00:44:44.974966 | orchestrator | osism.services.netdata : Install apt-transport-https package ------------ 3.99s 2025-09-27 00:44:44.974976 | orchestrator | osism.services.netdata : Restart service netdata ------------------------ 3.49s 2025-09-27 00:44:44.974987 | orchestrator | osism.services.netdata : Manage service netdata ------------------------- 2.21s 2025-09-27 00:44:44.974998 | orchestrator | osism.services.netdata : Remove old architecture-dependent repository --- 1.71s 2025-09-27 00:44:44.975008 | orchestrator | osism.services.netdata : Add repository gpg key ------------------------- 1.64s 2025-09-27 00:44:44.975019 | orchestrator | osism.services.netdata : Add netdata user to docker group --------------- 1.64s 2025-09-27 00:44:44.975029 | orchestrator | osism.services.netdata : Set sysctl vm.max_map_count parameter ---------- 1.62s 2025-09-27 00:44:44.975040 | orchestrator | osism.services.netdata : Opt out from anonymous statistics -------------- 1.52s 2025-09-27 00:44:44.975056 | orchestrator | osism.services.netdata : Include distribution specific install tasks ---- 1.34s 2025-09-27 00:44:44.975073 | orchestrator | osism.services.netdata : Include host type specific tasks --------------- 1.33s 2025-09-27 00:44:44.975084 | orchestrator | osism.services.netdata : Include config tasks --------------------------- 1.21s 2025-09-27 00:44:44.975095 | orchestrator | Group hosts based on enabled services ----------------------------------- 1.01s 2025-09-27 00:44:44.975106 | orchestrator | osism.services.netdata : Retrieve /etc/netdata/.opt-out-from-anonymous-statistics status --- 0.99s 2025-09-27 00:44:48.028587 | orchestrator | 2025-09-27 00:44:48 | INFO  | Task 
ce7cb7a8-3653-4424-bfed-7e1a3f9348b1 is in state STARTED 2025-09-27 00:44:48.030124 | orchestrator | 2025-09-27 00:44:48 | INFO  | Task 6738d185-a1b4-49ce-965d-88758fa6634a is in state STARTED 2025-09-27 00:44:48.031679 | orchestrator | 2025-09-27 00:44:48 | INFO  | Task 625d002c-1b01-4e2f-8d5c-d5ab32c60c73 is in state STARTED 2025-09-27 00:44:48.033253 | orchestrator | 2025-09-27 00:44:48 | INFO  | Task 5b160b40-3f10-4cee-93e8-cb0339f4d343 is in state SUCCESS 2025-09-27 00:44:48.034925 | orchestrator | 2025-09-27 00:44:48.034967 | orchestrator | 2025-09-27 00:44:48.034978 | orchestrator | PLAY [Group hosts based on configuration] ************************************** 2025-09-27 00:44:48.034988 | orchestrator | 2025-09-27 00:44:48.035006 | orchestrator | TASK [Group hosts based on Kolla action] *************************************** 2025-09-27 00:44:48.035017 | orchestrator | Saturday 27 September 2025 00:44:06 +0000 (0:00:00.420) 0:00:00.421 **** 2025-09-27 00:44:48.035026 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:44:48.035037 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:44:48.035047 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:44:48.035056 | orchestrator | 2025-09-27 00:44:48.035066 | orchestrator | TASK [Group hosts based on enabled services] *********************************** 2025-09-27 00:44:48.035076 | orchestrator | Saturday 27 September 2025 00:44:07 +0000 (0:00:01.151) 0:00:01.572 **** 2025-09-27 00:44:48.035085 | orchestrator | ok: [testbed-node-0] => (item=enable_redis_True) 2025-09-27 00:44:48.035095 | orchestrator | ok: [testbed-node-1] => (item=enable_redis_True) 2025-09-27 00:44:48.035105 | orchestrator | ok: [testbed-node-2] => (item=enable_redis_True) 2025-09-27 00:44:48.035114 | orchestrator | 2025-09-27 00:44:48.035124 | orchestrator | PLAY [Apply role redis] ******************************************************** 2025-09-27 00:44:48.035133 | orchestrator | 2025-09-27 00:44:48.035143 | orchestrator | TASK [redis : include_tasks] *************************************************** 2025-09-27 00:44:48.035152 | orchestrator | Saturday 27 September 2025 00:44:09 +0000 (0:00:01.559) 0:00:03.131 **** 2025-09-27 00:44:48.035162 | orchestrator | included: /ansible/roles/redis/tasks/deploy.yml for testbed-node-0, testbed-node-1, testbed-node-2 2025-09-27 00:44:48.035172 | orchestrator | 2025-09-27 00:44:48.035182 | orchestrator | TASK [redis : Ensuring config directories exist] ******************************* 2025-09-27 00:44:48.035191 | orchestrator | Saturday 27 September 2025 00:44:11 +0000 (0:00:01.934) 0:00:05.066 **** 2025-09-27 00:44:48.035204 | orchestrator | changed: [testbed-node-0] => (item={'key': 'redis', 'value': {'container_name': 'redis', 'group': 'redis', 'enabled': True, 'image': 'registry.osism.tech/kolla/redis:2024.2', 'volumes': ['/etc/kolla/redis/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'redis:/var/lib/redis/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen redis-server 6379'], 'timeout': '30'}}}) 2025-09-27 00:44:48.035245 | orchestrator | changed: [testbed-node-1] => (item={'key': 'redis', 'value': {'container_name': 'redis', 'group': 'redis', 'enabled': True, 'image': 'registry.osism.tech/kolla/redis:2024.2', 'volumes': ['/etc/kolla/redis/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 
'redis:/var/lib/redis/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen redis-server 6379'], 'timeout': '30'}}}) 2025-09-27 00:44:48.035274 | orchestrator | changed: [testbed-node-2] => (item={'key': 'redis', 'value': {'container_name': 'redis', 'group': 'redis', 'enabled': True, 'image': 'registry.osism.tech/kolla/redis:2024.2', 'volumes': ['/etc/kolla/redis/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'redis:/var/lib/redis/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen redis-server 6379'], 'timeout': '30'}}}) 2025-09-27 00:44:48.035286 | orchestrator | changed: [testbed-node-0] => (item={'key': 'redis-sentinel', 'value': {'container_name': 'redis_sentinel', 'group': 'redis', 'environment': {'REDIS_CONF': '/etc/redis/redis.conf', 'REDIS_GEN_CONF': '/etc/redis/redis-regenerated-by-config-rewrite.conf'}, 'enabled': True, 'image': 'registry.osism.tech/kolla/redis-sentinel:2024.2', 'volumes': ['/etc/kolla/redis-sentinel/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen redis-sentinel 26379'], 'timeout': '30'}}}) 2025-09-27 00:44:48.035310 | orchestrator | changed: [testbed-node-1] => (item={'key': 'redis-sentinel', 'value': {'container_name': 'redis_sentinel', 'group': 'redis', 'environment': {'REDIS_CONF': '/etc/redis/redis.conf', 'REDIS_GEN_CONF': '/etc/redis/redis-regenerated-by-config-rewrite.conf'}, 'enabled': True, 'image': 'registry.osism.tech/kolla/redis-sentinel:2024.2', 'volumes': ['/etc/kolla/redis-sentinel/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen redis-sentinel 26379'], 'timeout': '30'}}}) 2025-09-27 00:44:48.035325 | orchestrator | changed: [testbed-node-2] => (item={'key': 'redis-sentinel', 'value': {'container_name': 'redis_sentinel', 'group': 'redis', 'environment': {'REDIS_CONF': '/etc/redis/redis.conf', 'REDIS_GEN_CONF': '/etc/redis/redis-regenerated-by-config-rewrite.conf'}, 'enabled': True, 'image': 'registry.osism.tech/kolla/redis-sentinel:2024.2', 'volumes': ['/etc/kolla/redis-sentinel/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen redis-sentinel 26379'], 'timeout': '30'}}}) 2025-09-27 00:44:48.035335 | orchestrator | 2025-09-27 00:44:48.035345 | orchestrator | TASK [redis : Copying over default config.json files] ************************** 2025-09-27 00:44:48.035355 | orchestrator | Saturday 27 September 2025 00:44:13 +0000 (0:00:02.645) 0:00:07.711 **** 2025-09-27 00:44:48.035365 | orchestrator | changed: [testbed-node-2] => (item={'key': 'redis', 'value': {'container_name': 'redis', 'group': 'redis', 'enabled': True, 'image': 'registry.osism.tech/kolla/redis:2024.2', 'volumes': ['/etc/kolla/redis/:/var/lib/kolla/config_files/:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'redis:/var/lib/redis/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen redis-server 6379'], 'timeout': '30'}}}) 2025-09-27 00:44:48.035375 | orchestrator | changed: [testbed-node-1] => (item={'key': 'redis', 'value': {'container_name': 'redis', 'group': 'redis', 'enabled': True, 'image': 'registry.osism.tech/kolla/redis:2024.2', 'volumes': ['/etc/kolla/redis/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'redis:/var/lib/redis/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen redis-server 6379'], 'timeout': '30'}}}) 2025-09-27 00:44:48.035392 | orchestrator | changed: [testbed-node-0] => (item={'key': 'redis', 'value': {'container_name': 'redis', 'group': 'redis', 'enabled': True, 'image': 'registry.osism.tech/kolla/redis:2024.2', 'volumes': ['/etc/kolla/redis/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'redis:/var/lib/redis/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen redis-server 6379'], 'timeout': '30'}}}) 2025-09-27 00:44:48.035402 | orchestrator | changed: [testbed-node-1] => (item={'key': 'redis-sentinel', 'value': {'container_name': 'redis_sentinel', 'group': 'redis', 'environment': {'REDIS_CONF': '/etc/redis/redis.conf', 'REDIS_GEN_CONF': '/etc/redis/redis-regenerated-by-config-rewrite.conf'}, 'enabled': True, 'image': 'registry.osism.tech/kolla/redis-sentinel:2024.2', 'volumes': ['/etc/kolla/redis-sentinel/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen redis-sentinel 26379'], 'timeout': '30'}}}) 2025-09-27 00:44:48.035413 | orchestrator | changed: [testbed-node-2] => (item={'key': 'redis-sentinel', 'value': {'container_name': 'redis_sentinel', 'group': 'redis', 'environment': {'REDIS_CONF': '/etc/redis/redis.conf', 'REDIS_GEN_CONF': '/etc/redis/redis-regenerated-by-config-rewrite.conf'}, 'enabled': True, 'image': 'registry.osism.tech/kolla/redis-sentinel:2024.2', 'volumes': ['/etc/kolla/redis-sentinel/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen redis-sentinel 26379'], 'timeout': '30'}}}) 2025-09-27 00:44:48.035433 | orchestrator | changed: [testbed-node-0] => (item={'key': 'redis-sentinel', 'value': {'container_name': 'redis_sentinel', 'group': 'redis', 'environment': {'REDIS_CONF': '/etc/redis/redis.conf', 'REDIS_GEN_CONF': '/etc/redis/redis-regenerated-by-config-rewrite.conf'}, 'enabled': True, 'image': 'registry.osism.tech/kolla/redis-sentinel:2024.2', 'volumes': ['/etc/kolla/redis-sentinel/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 
'test': ['CMD-SHELL', 'healthcheck_listen redis-sentinel 26379'], 'timeout': '30'}}}) 2025-09-27 00:44:48.035444 | orchestrator | 2025-09-27 00:44:48.035454 | orchestrator | TASK [redis : Copying over redis config files] ********************************* 2025-09-27 00:44:48.035464 | orchestrator | Saturday 27 September 2025 00:44:17 +0000 (0:00:04.294) 0:00:12.006 **** 2025-09-27 00:44:48.035474 | orchestrator | changed: [testbed-node-0] => (item={'key': 'redis', 'value': {'container_name': 'redis', 'group': 'redis', 'enabled': True, 'image': 'registry.osism.tech/kolla/redis:2024.2', 'volumes': ['/etc/kolla/redis/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'redis:/var/lib/redis/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen redis-server 6379'], 'timeout': '30'}}}) 2025-09-27 00:44:48.035484 | orchestrator | changed: [testbed-node-1] => (item={'key': 'redis', 'value': {'container_name': 'redis', 'group': 'redis', 'enabled': True, 'image': 'registry.osism.tech/kolla/redis:2024.2', 'volumes': ['/etc/kolla/redis/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'redis:/var/lib/redis/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen redis-server 6379'], 'timeout': '30'}}}) 2025-09-27 00:44:48.035499 | orchestrator | changed: [testbed-node-2] => (item={'key': 'redis', 'value': {'container_name': 'redis', 'group': 'redis', 'enabled': True, 'image': 'registry.osism.tech/kolla/redis:2024.2', 'volumes': ['/etc/kolla/redis/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'redis:/var/lib/redis/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen redis-server 6379'], 'timeout': '30'}}}) 2025-09-27 00:44:48.035509 | orchestrator | changed: [testbed-node-0] => (item={'key': 'redis-sentinel', 'value': {'container_name': 'redis_sentinel', 'group': 'redis', 'environment': {'REDIS_CONF': '/etc/redis/redis.conf', 'REDIS_GEN_CONF': '/etc/redis/redis-regenerated-by-config-rewrite.conf'}, 'enabled': True, 'image': 'registry.osism.tech/kolla/redis-sentinel:2024.2', 'volumes': ['/etc/kolla/redis-sentinel/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen redis-sentinel 26379'], 'timeout': '30'}}}) 2025-09-27 00:44:48.035520 | orchestrator | changed: [testbed-node-1] => (item={'key': 'redis-sentinel', 'value': {'container_name': 'redis_sentinel', 'group': 'redis', 'environment': {'REDIS_CONF': '/etc/redis/redis.conf', 'REDIS_GEN_CONF': '/etc/redis/redis-regenerated-by-config-rewrite.conf'}, 'enabled': True, 'image': 'registry.osism.tech/kolla/redis-sentinel:2024.2', 'volumes': ['/etc/kolla/redis-sentinel/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen redis-sentinel 26379'], 
'timeout': '30'}}}) 2025-09-27 00:44:48.035540 | orchestrator | changed: [testbed-node-2] => (item={'key': 'redis-sentinel', 'value': {'container_name': 'redis_sentinel', 'group': 'redis', 'environment': {'REDIS_CONF': '/etc/redis/redis.conf', 'REDIS_GEN_CONF': '/etc/redis/redis-regenerated-by-config-rewrite.conf'}, 'enabled': True, 'image': 'registry.osism.tech/kolla/redis-sentinel:2024.2', 'volumes': ['/etc/kolla/redis-sentinel/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen redis-sentinel 26379'], 'timeout': '30'}}}) 2025-09-27 00:44:48.035550 | orchestrator | 2025-09-27 00:44:48.035560 | orchestrator | TASK [redis : Check redis containers] ****************************************** 2025-09-27 00:44:48.035570 | orchestrator | Saturday 27 September 2025 00:44:23 +0000 (0:00:05.395) 0:00:17.401 **** 2025-09-27 00:44:48.035580 | orchestrator | changed: [testbed-node-2] => (item={'key': 'redis', 'value': {'container_name': 'redis', 'group': 'redis', 'enabled': True, 'image': 'registry.osism.tech/kolla/redis:2024.2', 'volumes': ['/etc/kolla/redis/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'redis:/var/lib/redis/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen redis-server 6379'], 'timeout': '30'}}}) 2025-09-27 00:44:48.035590 | orchestrator | changed: [testbed-node-1] => (item={'key': 'redis', 'value': {'container_name': 'redis', 'group': 'redis', 'enabled': True, 'image': 'registry.osism.tech/kolla/redis:2024.2', 'volumes': ['/etc/kolla/redis/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'redis:/var/lib/redis/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen redis-server 6379'], 'timeout': '30'}}}) 2025-09-27 00:44:48.035606 | orchestrator | changed: [testbed-node-0] => (item={'key': 'redis', 'value': {'container_name': 'redis', 'group': 'redis', 'enabled': True, 'image': 'registry.osism.tech/kolla/redis:2024.2', 'volumes': ['/etc/kolla/redis/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'redis:/var/lib/redis/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen redis-server 6379'], 'timeout': '30'}}}) 2025-09-27 00:44:48.035616 | orchestrator | changed: [testbed-node-1] => (item={'key': 'redis-sentinel', 'value': {'container_name': 'redis_sentinel', 'group': 'redis', 'environment': {'REDIS_CONF': '/etc/redis/redis.conf', 'REDIS_GEN_CONF': '/etc/redis/redis-regenerated-by-config-rewrite.conf'}, 'enabled': True, 'image': 'registry.osism.tech/kolla/redis-sentinel:2024.2', 'volumes': ['/etc/kolla/redis-sentinel/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen redis-sentinel 26379'], 'timeout': '30'}}}) 2025-09-27 00:44:48.035626 | orchestrator | changed: 
[testbed-node-2] => (item={'key': 'redis-sentinel', 'value': {'container_name': 'redis_sentinel', 'group': 'redis', 'environment': {'REDIS_CONF': '/etc/redis/redis.conf', 'REDIS_GEN_CONF': '/etc/redis/redis-regenerated-by-config-rewrite.conf'}, 'enabled': True, 'image': 'registry.osism.tech/kolla/redis-sentinel:2024.2', 'volumes': ['/etc/kolla/redis-sentinel/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen redis-sentinel 26379'], 'timeout': '30'}}}) 2025-09-27 00:44:48.035643 | orchestrator | changed: [testbed-node-0] => (item={'key': 'redis-sentinel', 'value': {'container_name': 'redis_sentinel', 'group': 'redis', 'environment': {'REDIS_CONF': '/etc/redis/redis.conf', 'REDIS_GEN_CONF': '/etc/redis/redis-regenerated-by-config-rewrite.conf'}, 'enabled': True, 'image': 'registry.osism.tech/kolla/redis-sentinel:2024.2', 'volumes': ['/etc/kolla/redis-sentinel/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen redis-sentinel 26379'], 'timeout': '30'}}}) 2025-09-27 00:44:48.035653 | orchestrator | 2025-09-27 00:44:48.035663 | orchestrator | TASK [redis : Flush handlers] ************************************************** 2025-09-27 00:44:48.035673 | orchestrator | Saturday 27 September 2025 00:44:25 +0000 (0:00:02.196) 0:00:19.598 **** 2025-09-27 00:44:48.035682 | orchestrator | 2025-09-27 00:44:48.035692 | orchestrator | TASK [redis : Flush handlers] ************************************************** 2025-09-27 00:44:48.035702 | orchestrator | Saturday 27 September 2025 00:44:25 +0000 (0:00:00.174) 0:00:19.773 **** 2025-09-27 00:44:48.035711 | orchestrator | 2025-09-27 00:44:48.035721 | orchestrator | TASK [redis : Flush handlers] ************************************************** 2025-09-27 00:44:48.035731 | orchestrator | Saturday 27 September 2025 00:44:25 +0000 (0:00:00.193) 0:00:19.966 **** 2025-09-27 00:44:48.035740 | orchestrator | 2025-09-27 00:44:48.035750 | orchestrator | RUNNING HANDLER [redis : Restart redis container] ****************************** 2025-09-27 00:44:48.035759 | orchestrator | Saturday 27 September 2025 00:44:26 +0000 (0:00:00.107) 0:00:20.073 **** 2025-09-27 00:44:48.035769 | orchestrator | changed: [testbed-node-0] 2025-09-27 00:44:48.035779 | orchestrator | changed: [testbed-node-1] 2025-09-27 00:44:48.035797 | orchestrator | changed: [testbed-node-2] 2025-09-27 00:44:48.035806 | orchestrator | 2025-09-27 00:44:48.035816 | orchestrator | RUNNING HANDLER [redis : Restart redis-sentinel container] ********************* 2025-09-27 00:44:48.035826 | orchestrator | Saturday 27 September 2025 00:44:44 +0000 (0:00:18.127) 0:00:38.201 **** 2025-09-27 00:44:48.035835 | orchestrator | changed: [testbed-node-0] 2025-09-27 00:44:48.035845 | orchestrator | changed: [testbed-node-1] 2025-09-27 00:44:48.035854 | orchestrator | changed: [testbed-node-2] 2025-09-27 00:44:48.035864 | orchestrator | 2025-09-27 00:44:48.035873 | orchestrator | PLAY RECAP ********************************************************************* 2025-09-27 00:44:48.035883 | orchestrator | testbed-node-0 : ok=9  changed=6  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 
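
The redis tasks above loop over Kolla-style service definitions (container_name, image, volumes, and a healthcheck with interval, retries, start_period, test, and timeout), and "Check redis containers" ensures a container matching each definition exists. The sketch below is a simplified illustration only, assuming the CMD-SHELL healthcheck form shown in the log; the real work is done by Kolla Ansible's container module, not by this script, and the dictionary simply mirrors the 'redis' item printed above.

    import shlex

    redis_service = {
        "container_name": "redis",
        "image": "registry.osism.tech/kolla/redis:2024.2",
        "volumes": [
            "/etc/kolla/redis/:/var/lib/kolla/config_files/:ro",
            "/etc/localtime:/etc/localtime:ro",
            "/etc/timezone:/etc/timezone:ro",
            "redis:/var/lib/redis/",
            "kolla_logs:/var/log/kolla/",
        ],
        "healthcheck": {
            "interval": "30",
            "retries": "3",
            "start_period": "5",
            "test": ["CMD-SHELL", "healthcheck_listen redis-server 6379"],
            "timeout": "30",
        },
    }

    def docker_run_command(service: dict) -> str:
        """Render a roughly equivalent `docker run` invocation for one definition."""
        hc = service["healthcheck"]
        parts = [
            "docker", "run", "-d",
            "--name", service["container_name"],
            # Assumes the CMD-SHELL form: index 1 is the shell command string.
            "--health-cmd", shlex.quote(hc["test"][1]),
            "--health-interval", f"{hc['interval']}s",
            "--health-retries", hc["retries"],
            "--health-start-period", f"{hc['start_period']}s",
            "--health-timeout", f"{hc['timeout']}s",
        ]
        for volume in service["volumes"]:
            parts += ["-v", volume]
        parts.append(service["image"])
        return " ".join(parts)

    print(docker_run_command(redis_service))
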
2025-09-27 00:44:48.035894 | orchestrator | testbed-node-1 : ok=9  changed=6  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 2025-09-27 00:44:48.035903 | orchestrator | testbed-node-2 : ok=9  changed=6  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 2025-09-27 00:44:48.035913 | orchestrator | 2025-09-27 00:44:48.035922 | orchestrator | 2025-09-27 00:44:48.035932 | orchestrator | TASKS RECAP ******************************************************************** 2025-09-27 00:44:48.035942 | orchestrator | Saturday 27 September 2025 00:44:47 +0000 (0:00:03.636) 0:00:41.838 **** 2025-09-27 00:44:48.035951 | orchestrator | =============================================================================== 2025-09-27 00:44:48.035961 | orchestrator | redis : Restart redis container ---------------------------------------- 18.13s 2025-09-27 00:44:48.035970 | orchestrator | redis : Copying over redis config files --------------------------------- 5.43s 2025-09-27 00:44:48.035980 | orchestrator | redis : Copying over default config.json files -------------------------- 4.29s 2025-09-27 00:44:48.035989 | orchestrator | redis : Restart redis-sentinel container -------------------------------- 3.64s 2025-09-27 00:44:48.035999 | orchestrator | redis : Ensuring config directories exist ------------------------------- 2.65s 2025-09-27 00:44:48.036008 | orchestrator | redis : Check redis containers ------------------------------------------ 2.16s 2025-09-27 00:44:48.036018 | orchestrator | redis : include_tasks --------------------------------------------------- 1.93s 2025-09-27 00:44:48.036033 | orchestrator | Group hosts based on enabled services ----------------------------------- 1.56s 2025-09-27 00:44:48.036043 | orchestrator | Group hosts based on Kolla action --------------------------------------- 1.15s 2025-09-27 00:44:48.036052 | orchestrator | redis : Flush handlers -------------------------------------------------- 0.48s 2025-09-27 00:44:48.036062 | orchestrator | 2025-09-27 00:44:48 | INFO  | Task 4bac4415-47ae-418f-938a-24c52f1a62d3 is in state STARTED 2025-09-27 00:44:48.038750 | orchestrator | 2025-09-27 00:44:48 | INFO  | Task 1d877c1d-aea7-49a7-ac1d-fd8e8254af58 is in state STARTED 2025-09-27 00:44:48.038776 | orchestrator | 2025-09-27 00:44:48 | INFO  | Task 0dbfe98d-624d-4af0-b069-0ef2b2ddc90e is in state SUCCESS 2025-09-27 00:44:48.039755 | orchestrator | 2025-09-27 00:44:48 | INFO  | Task 0bfbbc49-122f-42e2-b6bc-552d5dd0b6e2 is in state STARTED 2025-09-27 00:44:48.039777 | orchestrator | 2025-09-27 00:44:48 | INFO  | Wait 1 second(s) until the next check 2025-09-27 00:44:51.084457 | orchestrator | 2025-09-27 00:44:51 | INFO  | Task ce7cb7a8-3653-4424-bfed-7e1a3f9348b1 is in state STARTED 2025-09-27 00:44:51.085812 | orchestrator | 2025-09-27 00:44:51 | INFO  | Task 6738d185-a1b4-49ce-965d-88758fa6634a is in state STARTED 2025-09-27 00:44:51.087311 | orchestrator | 2025-09-27 00:44:51 | INFO  | Task 625d002c-1b01-4e2f-8d5c-d5ab32c60c73 is in state STARTED 2025-09-27 00:44:51.088750 | orchestrator | 2025-09-27 00:44:51 | INFO  | Task 4bac4415-47ae-418f-938a-24c52f1a62d3 is in state STARTED 2025-09-27 00:44:51.090304 | orchestrator | 2025-09-27 00:44:51 | INFO  | Task 1d877c1d-aea7-49a7-ac1d-fd8e8254af58 is in state STARTED 2025-09-27 00:44:51.091551 | orchestrator | 2025-09-27 00:44:51 | INFO  | Task 0bfbbc49-122f-42e2-b6bc-552d5dd0b6e2 is in state STARTED 2025-09-27 00:44:51.091575 | orchestrator | 2025-09-27 00:44:51 | INFO  | Wait 1 second(s) until the next check 2025-09-27 
00:44:54.135798 | orchestrator | 2025-09-27 00:44:54 | INFO  | Task ce7cb7a8-3653-4424-bfed-7e1a3f9348b1 is in state STARTED 2025-09-27 00:44:54.136404 | orchestrator | 2025-09-27 00:44:54 | INFO  | Task 6738d185-a1b4-49ce-965d-88758fa6634a is in state SUCCESS 2025-09-27 00:44:54.136889 | orchestrator | 2025-09-27 00:44:54 | INFO  | Task 625d002c-1b01-4e2f-8d5c-d5ab32c60c73 is in state STARTED 2025-09-27 00:44:54.137493 | orchestrator | 2025-09-27 00:44:54 | INFO  | Task 4bac4415-47ae-418f-938a-24c52f1a62d3 is in state STARTED 2025-09-27 00:44:54.138089 | orchestrator | 2025-09-27 00:44:54 | INFO  | Task 1d877c1d-aea7-49a7-ac1d-fd8e8254af58 is in state STARTED 2025-09-27 00:44:54.138809 | orchestrator | 2025-09-27 00:44:54 | INFO  | Task 0bfbbc49-122f-42e2-b6bc-552d5dd0b6e2 is in state STARTED 2025-09-27 00:44:54.138836 | orchestrator | 2025-09-27 00:44:54 | INFO  | Wait 1 second(s) until the next check 2025-09-27 00:44:57.169019 | orchestrator | 2025-09-27 00:44:57 | INFO  | Task ce7cb7a8-3653-4424-bfed-7e1a3f9348b1 is in state STARTED 2025-09-27 00:44:57.169976 | orchestrator | 2025-09-27 00:44:57 | INFO  | Task 625d002c-1b01-4e2f-8d5c-d5ab32c60c73 is in state STARTED 2025-09-27 00:44:57.171376 | orchestrator | 2025-09-27 00:44:57 | INFO  | Task 4bac4415-47ae-418f-938a-24c52f1a62d3 is in state STARTED 2025-09-27 00:44:57.173335 | orchestrator | 2025-09-27 00:44:57 | INFO  | Task 1d877c1d-aea7-49a7-ac1d-fd8e8254af58 is in state STARTED 2025-09-27 00:44:57.175286 | orchestrator | 2025-09-27 00:44:57 | INFO  | Task 0bfbbc49-122f-42e2-b6bc-552d5dd0b6e2 is in state STARTED 2025-09-27 00:44:57.175421 | orchestrator | 2025-09-27 00:44:57 | INFO  | Wait 1 second(s) until the next check 2025-09-27 00:45:00.212320 | orchestrator | 2025-09-27 00:45:00 | INFO  | Task ce7cb7a8-3653-4424-bfed-7e1a3f9348b1 is in state STARTED 2025-09-27 00:45:00.214666 | orchestrator | 2025-09-27 00:45:00 | INFO  | Task 625d002c-1b01-4e2f-8d5c-d5ab32c60c73 is in state STARTED 2025-09-27 00:45:00.215572 | orchestrator | 2025-09-27 00:45:00 | INFO  | Task 4bac4415-47ae-418f-938a-24c52f1a62d3 is in state STARTED 2025-09-27 00:45:00.216438 | orchestrator | 2025-09-27 00:45:00 | INFO  | Task 1d877c1d-aea7-49a7-ac1d-fd8e8254af58 is in state STARTED 2025-09-27 00:45:00.218002 | orchestrator | 2025-09-27 00:45:00 | INFO  | Task 0bfbbc49-122f-42e2-b6bc-552d5dd0b6e2 is in state STARTED 2025-09-27 00:45:00.218337 | orchestrator | 2025-09-27 00:45:00 | INFO  | Wait 1 second(s) until the next check 2025-09-27 00:45:03.248903 | orchestrator | 2025-09-27 00:45:03 | INFO  | Task ce7cb7a8-3653-4424-bfed-7e1a3f9348b1 is in state STARTED 2025-09-27 00:45:03.250090 | orchestrator | 2025-09-27 00:45:03 | INFO  | Task 625d002c-1b01-4e2f-8d5c-d5ab32c60c73 is in state STARTED 2025-09-27 00:45:03.250661 | orchestrator | 2025-09-27 00:45:03 | INFO  | Task 4bac4415-47ae-418f-938a-24c52f1a62d3 is in state STARTED 2025-09-27 00:45:03.252694 | orchestrator | 2025-09-27 00:45:03 | INFO  | Task 1d877c1d-aea7-49a7-ac1d-fd8e8254af58 is in state STARTED 2025-09-27 00:45:03.253745 | orchestrator | 2025-09-27 00:45:03 | INFO  | Task 0bfbbc49-122f-42e2-b6bc-552d5dd0b6e2 is in state STARTED 2025-09-27 00:45:03.253795 | orchestrator | 2025-09-27 00:45:03 | INFO  | Wait 1 second(s) until the next check 2025-09-27 00:45:06.287552 | orchestrator | 2025-09-27 00:45:06 | INFO  | Task ce7cb7a8-3653-4424-bfed-7e1a3f9348b1 is in state STARTED 2025-09-27 00:45:06.288076 | orchestrator | 2025-09-27 00:45:06 | INFO  | Task 625d002c-1b01-4e2f-8d5c-d5ab32c60c73 is in 
state STARTED 2025-09-27 00:45:06.289528 | orchestrator | 2025-09-27 00:45:06 | INFO  | Task 4bac4415-47ae-418f-938a-24c52f1a62d3 is in state STARTED 2025-09-27 00:45:06.290998 | orchestrator | 2025-09-27 00:45:06 | INFO  | Task 1d877c1d-aea7-49a7-ac1d-fd8e8254af58 is in state STARTED 2025-09-27 00:45:06.292284 | orchestrator | 2025-09-27 00:45:06 | INFO  | Task 0bfbbc49-122f-42e2-b6bc-552d5dd0b6e2 is in state STARTED 2025-09-27 00:45:06.292308 | orchestrator | 2025-09-27 00:45:06 | INFO  | Wait 1 second(s) until the next check 2025-09-27 00:45:09.330891 | orchestrator | 2025-09-27 00:45:09 | INFO  | Task ce7cb7a8-3653-4424-bfed-7e1a3f9348b1 is in state STARTED 2025-09-27 00:45:09.331768 | orchestrator | 2025-09-27 00:45:09 | INFO  | Task 625d002c-1b01-4e2f-8d5c-d5ab32c60c73 is in state STARTED 2025-09-27 00:45:09.332761 | orchestrator | 2025-09-27 00:45:09 | INFO  | Task 4bac4415-47ae-418f-938a-24c52f1a62d3 is in state STARTED 2025-09-27 00:45:09.333945 | orchestrator | 2025-09-27 00:45:09 | INFO  | Task 1d877c1d-aea7-49a7-ac1d-fd8e8254af58 is in state STARTED 2025-09-27 00:45:09.334403 | orchestrator | 2025-09-27 00:45:09 | INFO  | Task 0bfbbc49-122f-42e2-b6bc-552d5dd0b6e2 is in state STARTED 2025-09-27 00:45:09.334433 | orchestrator | 2025-09-27 00:45:09 | INFO  | Wait 1 second(s) until the next check 2025-09-27 00:45:12.362532 | orchestrator | 2025-09-27 00:45:12 | INFO  | Task ce7cb7a8-3653-4424-bfed-7e1a3f9348b1 is in state STARTED 2025-09-27 00:45:12.364844 | orchestrator | 2025-09-27 00:45:12 | INFO  | Task 625d002c-1b01-4e2f-8d5c-d5ab32c60c73 is in state STARTED 2025-09-27 00:45:12.365372 | orchestrator | 2025-09-27 00:45:12 | INFO  | Task 4bac4415-47ae-418f-938a-24c52f1a62d3 is in state STARTED 2025-09-27 00:45:12.366771 | orchestrator | 2025-09-27 00:45:12 | INFO  | Task 1d877c1d-aea7-49a7-ac1d-fd8e8254af58 is in state STARTED 2025-09-27 00:45:12.368152 | orchestrator | 2025-09-27 00:45:12 | INFO  | Task 0bfbbc49-122f-42e2-b6bc-552d5dd0b6e2 is in state STARTED 2025-09-27 00:45:12.368308 | orchestrator | 2025-09-27 00:45:12 | INFO  | Wait 1 second(s) until the next check 2025-09-27 00:45:15.405663 | orchestrator | 2025-09-27 00:45:15 | INFO  | Task ce7cb7a8-3653-4424-bfed-7e1a3f9348b1 is in state STARTED 2025-09-27 00:45:15.406402 | orchestrator | 2025-09-27 00:45:15 | INFO  | Task 625d002c-1b01-4e2f-8d5c-d5ab32c60c73 is in state STARTED 2025-09-27 00:45:15.410171 | orchestrator | 2025-09-27 00:45:15 | INFO  | Task 4bac4415-47ae-418f-938a-24c52f1a62d3 is in state STARTED 2025-09-27 00:45:15.412712 | orchestrator | 2025-09-27 00:45:15 | INFO  | Task 1d877c1d-aea7-49a7-ac1d-fd8e8254af58 is in state STARTED 2025-09-27 00:45:15.413456 | orchestrator | 2025-09-27 00:45:15 | INFO  | Task 0bfbbc49-122f-42e2-b6bc-552d5dd0b6e2 is in state STARTED 2025-09-27 00:45:15.413576 | orchestrator | 2025-09-27 00:45:15 | INFO  | Wait 1 second(s) until the next check 2025-09-27 00:45:18.440916 | orchestrator | 2025-09-27 00:45:18 | INFO  | Task ce7cb7a8-3653-4424-bfed-7e1a3f9348b1 is in state STARTED 2025-09-27 00:45:18.441273 | orchestrator | 2025-09-27 00:45:18 | INFO  | Task 625d002c-1b01-4e2f-8d5c-d5ab32c60c73 is in state STARTED 2025-09-27 00:45:18.441696 | orchestrator | 2025-09-27 00:45:18 | INFO  | Task 4bac4415-47ae-418f-938a-24c52f1a62d3 is in state STARTED 2025-09-27 00:45:18.442336 | orchestrator | 2025-09-27 00:45:18 | INFO  | Task 1d877c1d-aea7-49a7-ac1d-fd8e8254af58 is in state STARTED 2025-09-27 00:45:18.442898 | orchestrator | 2025-09-27 00:45:18 | INFO  | Task 
0bfbbc49-122f-42e2-b6bc-552d5dd0b6e2 is in state STARTED 2025-09-27 00:45:18.443023 | orchestrator | 2025-09-27 00:45:18 | INFO  | Wait 1 second(s) until the next check 2025-09-27 00:45:21.496681 | orchestrator | 2025-09-27 00:45:21.496777 | orchestrator | 2025-09-27 00:45:21.496793 | orchestrator | PLAY [Group hosts based on configuration] ************************************** 2025-09-27 00:45:21.496806 | orchestrator | 2025-09-27 00:45:21.496817 | orchestrator | TASK [Group hosts based on Kolla action] *************************************** 2025-09-27 00:45:21.496829 | orchestrator | Saturday 27 September 2025 00:44:05 +0000 (0:00:00.993) 0:00:00.993 **** 2025-09-27 00:45:21.496840 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:45:21.496852 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:45:21.496863 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:45:21.496874 | orchestrator | 2025-09-27 00:45:21.496885 | orchestrator | TASK [Group hosts based on enabled services] *********************************** 2025-09-27 00:45:21.496896 | orchestrator | Saturday 27 September 2025 00:44:06 +0000 (0:00:00.972) 0:00:01.966 **** 2025-09-27 00:45:21.496907 | orchestrator | ok: [testbed-node-0] => (item=enable_memcached_True) 2025-09-27 00:45:21.496918 | orchestrator | ok: [testbed-node-1] => (item=enable_memcached_True) 2025-09-27 00:45:21.496929 | orchestrator | ok: [testbed-node-2] => (item=enable_memcached_True) 2025-09-27 00:45:21.496939 | orchestrator | 2025-09-27 00:45:21.496950 | orchestrator | PLAY [Apply role memcached] **************************************************** 2025-09-27 00:45:21.496961 | orchestrator | 2025-09-27 00:45:21.496972 | orchestrator | TASK [memcached : include_tasks] *********************************************** 2025-09-27 00:45:21.496983 | orchestrator | Saturday 27 September 2025 00:44:07 +0000 (0:00:00.962) 0:00:02.928 **** 2025-09-27 00:45:21.496994 | orchestrator | included: /ansible/roles/memcached/tasks/deploy.yml for testbed-node-0, testbed-node-1, testbed-node-2 2025-09-27 00:45:21.497005 | orchestrator | 2025-09-27 00:45:21.497124 | orchestrator | TASK [memcached : Ensuring config directories exist] *************************** 2025-09-27 00:45:21.497138 | orchestrator | Saturday 27 September 2025 00:44:10 +0000 (0:00:03.158) 0:00:06.087 **** 2025-09-27 00:45:21.497150 | orchestrator | changed: [testbed-node-0] => (item=memcached) 2025-09-27 00:45:21.497161 | orchestrator | changed: [testbed-node-1] => (item=memcached) 2025-09-27 00:45:21.497172 | orchestrator | changed: [testbed-node-2] => (item=memcached) 2025-09-27 00:45:21.497183 | orchestrator | 2025-09-27 00:45:21.497193 | orchestrator | TASK [memcached : Copying over config.json files for services] ***************** 2025-09-27 00:45:21.497204 | orchestrator | Saturday 27 September 2025 00:44:12 +0000 (0:00:01.832) 0:00:07.919 **** 2025-09-27 00:45:21.497262 | orchestrator | changed: [testbed-node-0] => (item=memcached) 2025-09-27 00:45:21.497281 | orchestrator | changed: [testbed-node-1] => (item=memcached) 2025-09-27 00:45:21.497299 | orchestrator | changed: [testbed-node-2] => (item=memcached) 2025-09-27 00:45:21.497316 | orchestrator | 2025-09-27 00:45:21.497333 | orchestrator | TASK [memcached : Check memcached container] *********************************** 2025-09-27 00:45:21.497363 | orchestrator | Saturday 27 September 2025 00:44:15 +0000 (0:00:02.928) 0:00:10.848 **** 2025-09-27 00:45:21.497383 | orchestrator | changed: [testbed-node-1] 2025-09-27 00:45:21.497402 | orchestrator | 
changed: [testbed-node-0] 2025-09-27 00:45:21.497420 | orchestrator | changed: [testbed-node-2] 2025-09-27 00:45:21.497431 | orchestrator | 2025-09-27 00:45:21.497442 | orchestrator | RUNNING HANDLER [memcached : Restart memcached container] ********************** 2025-09-27 00:45:21.497453 | orchestrator | Saturday 27 September 2025 00:44:19 +0000 (0:00:04.048) 0:00:14.897 **** 2025-09-27 00:45:21.497464 | orchestrator | changed: [testbed-node-0] 2025-09-27 00:45:21.497474 | orchestrator | changed: [testbed-node-1] 2025-09-27 00:45:21.497485 | orchestrator | changed: [testbed-node-2] 2025-09-27 00:45:21.497496 | orchestrator | 2025-09-27 00:45:21.497506 | orchestrator | PLAY RECAP ********************************************************************* 2025-09-27 00:45:21.497517 | orchestrator | testbed-node-0 : ok=7  changed=4  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 2025-09-27 00:45:21.497530 | orchestrator | testbed-node-1 : ok=7  changed=4  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 2025-09-27 00:45:21.497561 | orchestrator | testbed-node-2 : ok=7  changed=4  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 2025-09-27 00:45:21.497572 | orchestrator | 2025-09-27 00:45:21.497583 | orchestrator | 2025-09-27 00:45:21.497594 | orchestrator | TASKS RECAP ******************************************************************** 2025-09-27 00:45:21.497604 | orchestrator | Saturday 27 September 2025 00:44:44 +0000 (0:00:25.001) 0:00:39.898 **** 2025-09-27 00:45:21.497615 | orchestrator | =============================================================================== 2025-09-27 00:45:21.497626 | orchestrator | memcached : Restart memcached container -------------------------------- 25.00s 2025-09-27 00:45:21.497637 | orchestrator | memcached : Check memcached container ----------------------------------- 4.05s 2025-09-27 00:45:21.497648 | orchestrator | memcached : include_tasks ----------------------------------------------- 3.16s 2025-09-27 00:45:21.497658 | orchestrator | memcached : Copying over config.json files for services ----------------- 2.93s 2025-09-27 00:45:21.497669 | orchestrator | memcached : Ensuring config directories exist --------------------------- 1.83s 2025-09-27 00:45:21.497679 | orchestrator | Group hosts based on Kolla action --------------------------------------- 0.97s 2025-09-27 00:45:21.497690 | orchestrator | Group hosts based on enabled services ----------------------------------- 0.96s 2025-09-27 00:45:21.497701 | orchestrator | 2025-09-27 00:45:21.497711 | orchestrator | 2025-09-27 00:45:21.497722 | orchestrator | PLAY [Apply role phpmyadmin] *************************************************** 2025-09-27 00:45:21.497732 | orchestrator | 2025-09-27 00:45:21.497743 | orchestrator | TASK [osism.services.phpmyadmin : Create traefik external network] ************* 2025-09-27 00:45:21.497756 | orchestrator | Saturday 27 September 2025 00:43:56 +0000 (0:00:00.218) 0:00:00.218 **** 2025-09-27 00:45:21.497768 | orchestrator | ok: [testbed-manager] 2025-09-27 00:45:21.497781 | orchestrator | 2025-09-27 00:45:21.497793 | orchestrator | TASK [osism.services.phpmyadmin : Create required directories] ***************** 2025-09-27 00:45:21.497804 | orchestrator | Saturday 27 September 2025 00:43:56 +0000 (0:00:00.779) 0:00:00.998 **** 2025-09-27 00:45:21.497837 | orchestrator | changed: [testbed-manager] => (item=/opt/phpmyadmin) 2025-09-27 00:45:21.497850 | orchestrator | 2025-09-27 00:45:21.497862 | orchestrator | TASK [osism.services.phpmyadmin : Copy 
docker-compose.yml file] **************** 2025-09-27 00:45:21.497874 | orchestrator | Saturday 27 September 2025 00:43:57 +0000 (0:00:00.558) 0:00:01.556 **** 2025-09-27 00:45:21.497887 | orchestrator | changed: [testbed-manager] 2025-09-27 00:45:21.497899 | orchestrator | 2025-09-27 00:45:21.497911 | orchestrator | TASK [osism.services.phpmyadmin : Manage phpmyadmin service] ******************* 2025-09-27 00:45:21.497923 | orchestrator | Saturday 27 September 2025 00:43:58 +0000 (0:00:01.469) 0:00:03.026 **** 2025-09-27 00:45:21.497935 | orchestrator | FAILED - RETRYING: [testbed-manager]: Manage phpmyadmin service (10 retries left). 2025-09-27 00:45:21.497953 | orchestrator | ok: [testbed-manager] 2025-09-27 00:45:21.498153 | orchestrator | 2025-09-27 00:45:21.498178 | orchestrator | RUNNING HANDLER [osism.services.phpmyadmin : Restart phpmyadmin service] ******* 2025-09-27 00:45:21.498189 | orchestrator | Saturday 27 September 2025 00:44:42 +0000 (0:00:43.723) 0:00:46.749 **** 2025-09-27 00:45:21.498200 | orchestrator | changed: [testbed-manager] 2025-09-27 00:45:21.498231 | orchestrator | 2025-09-27 00:45:21.498278 | orchestrator | PLAY RECAP ********************************************************************* 2025-09-27 00:45:21.498291 | orchestrator | testbed-manager : ok=5  changed=3  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 2025-09-27 00:45:21.498303 | orchestrator | 2025-09-27 00:45:21.498313 | orchestrator | 2025-09-27 00:45:21.498324 | orchestrator | TASKS RECAP ******************************************************************** 2025-09-27 00:45:21.498335 | orchestrator | Saturday 27 September 2025 00:44:51 +0000 (0:00:09.171) 0:00:55.920 **** 2025-09-27 00:45:21.498346 | orchestrator | =============================================================================== 2025-09-27 00:45:21.498356 | orchestrator | osism.services.phpmyadmin : Manage phpmyadmin service ------------------ 43.72s 2025-09-27 00:45:21.498379 | orchestrator | osism.services.phpmyadmin : Restart phpmyadmin service ------------------ 9.17s 2025-09-27 00:45:21.498390 | orchestrator | osism.services.phpmyadmin : Copy docker-compose.yml file ---------------- 1.47s 2025-09-27 00:45:21.498401 | orchestrator | osism.services.phpmyadmin : Create traefik external network ------------- 0.78s 2025-09-27 00:45:21.498412 | orchestrator | osism.services.phpmyadmin : Create required directories ----------------- 0.56s 2025-09-27 00:45:21.498423 | orchestrator | 2025-09-27 00:45:21.498433 | orchestrator | 2025-09-27 00:45:21.498444 | orchestrator | PLAY [Group hosts based on configuration] ************************************** 2025-09-27 00:45:21.498454 | orchestrator | 2025-09-27 00:45:21.498465 | orchestrator | TASK [Group hosts based on Kolla action] *************************************** 2025-09-27 00:45:21.498476 | orchestrator | Saturday 27 September 2025 00:44:10 +0000 (0:00:01.817) 0:00:01.817 **** 2025-09-27 00:45:21.498494 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:45:21.498505 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:45:21.498516 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:45:21.498527 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:45:21.498537 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:45:21.498548 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:45:21.498559 | orchestrator | 2025-09-27 00:45:21.498570 | orchestrator | TASK [Group hosts based on enabled services] *********************************** 2025-09-27 00:45:21.498580 | orchestrator | Saturday 27 
September 2025 00:44:12 +0000 (0:00:02.337) 0:00:04.154 **** 2025-09-27 00:45:21.498591 | orchestrator | ok: [testbed-node-0] => (item=enable_openvswitch_True_enable_ovs_dpdk_False) 2025-09-27 00:45:21.498602 | orchestrator | ok: [testbed-node-1] => (item=enable_openvswitch_True_enable_ovs_dpdk_False) 2025-09-27 00:45:21.498613 | orchestrator | ok: [testbed-node-2] => (item=enable_openvswitch_True_enable_ovs_dpdk_False) 2025-09-27 00:45:21.498624 | orchestrator | ok: [testbed-node-3] => (item=enable_openvswitch_True_enable_ovs_dpdk_False) 2025-09-27 00:45:21.498634 | orchestrator | ok: [testbed-node-4] => (item=enable_openvswitch_True_enable_ovs_dpdk_False) 2025-09-27 00:45:21.498645 | orchestrator | ok: [testbed-node-5] => (item=enable_openvswitch_True_enable_ovs_dpdk_False) 2025-09-27 00:45:21.498656 | orchestrator | 2025-09-27 00:45:21.498666 | orchestrator | PLAY [Apply role openvswitch] ************************************************** 2025-09-27 00:45:21.498677 | orchestrator | 2025-09-27 00:45:21.498687 | orchestrator | TASK [openvswitch : include_tasks] ********************************************* 2025-09-27 00:45:21.498698 | orchestrator | Saturday 27 September 2025 00:44:14 +0000 (0:00:02.314) 0:00:06.468 **** 2025-09-27 00:45:21.498710 | orchestrator | included: /ansible/roles/openvswitch/tasks/deploy.yml for testbed-node-0, testbed-node-1, testbed-node-2, testbed-node-3, testbed-node-4, testbed-node-5 2025-09-27 00:45:21.498723 | orchestrator | 2025-09-27 00:45:21.498733 | orchestrator | TASK [module-load : Load modules] ********************************************** 2025-09-27 00:45:21.498744 | orchestrator | Saturday 27 September 2025 00:44:17 +0000 (0:00:03.199) 0:00:09.668 **** 2025-09-27 00:45:21.498755 | orchestrator | changed: [testbed-node-0] => (item=openvswitch) 2025-09-27 00:45:21.498766 | orchestrator | changed: [testbed-node-1] => (item=openvswitch) 2025-09-27 00:45:21.498776 | orchestrator | changed: [testbed-node-2] => (item=openvswitch) 2025-09-27 00:45:21.498787 | orchestrator | changed: [testbed-node-3] => (item=openvswitch) 2025-09-27 00:45:21.498798 | orchestrator | changed: [testbed-node-4] => (item=openvswitch) 2025-09-27 00:45:21.498809 | orchestrator | changed: [testbed-node-5] => (item=openvswitch) 2025-09-27 00:45:21.498819 | orchestrator | 2025-09-27 00:45:21.498830 | orchestrator | TASK [module-load : Persist modules via modules-load.d] ************************ 2025-09-27 00:45:21.498841 | orchestrator | Saturday 27 September 2025 00:44:21 +0000 (0:00:03.653) 0:00:13.321 **** 2025-09-27 00:45:21.498851 | orchestrator | changed: [testbed-node-2] => (item=openvswitch) 2025-09-27 00:45:21.498862 | orchestrator | changed: [testbed-node-1] => (item=openvswitch) 2025-09-27 00:45:21.498873 | orchestrator | changed: [testbed-node-3] => (item=openvswitch) 2025-09-27 00:45:21.499050 | orchestrator | changed: [testbed-node-0] => (item=openvswitch) 2025-09-27 00:45:21.499084 | orchestrator | changed: [testbed-node-4] => (item=openvswitch) 2025-09-27 00:45:21.499096 | orchestrator | changed: [testbed-node-5] => (item=openvswitch) 2025-09-27 00:45:21.499106 | orchestrator | 2025-09-27 00:45:21.499117 | orchestrator | TASK [module-load : Drop module persistence] *********************************** 2025-09-27 00:45:21.499128 | orchestrator | Saturday 27 September 2025 00:44:24 +0000 (0:00:02.899) 0:00:16.220 **** 2025-09-27 00:45:21.499139 | orchestrator | skipping: [testbed-node-0] => (item=openvswitch)  2025-09-27 00:45:21.499150 | orchestrator | skipping: 
[testbed-node-0] 2025-09-27 00:45:21.499161 | orchestrator | skipping: [testbed-node-1] => (item=openvswitch)  2025-09-27 00:45:21.499171 | orchestrator | skipping: [testbed-node-2] => (item=openvswitch)  2025-09-27 00:45:21.499182 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:45:21.499193 | orchestrator | skipping: [testbed-node-3] => (item=openvswitch)  2025-09-27 00:45:21.499203 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:45:21.499269 | orchestrator | skipping: [testbed-node-4] => (item=openvswitch)  2025-09-27 00:45:21.499280 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:45:21.499291 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:45:21.499302 | orchestrator | skipping: [testbed-node-5] => (item=openvswitch)  2025-09-27 00:45:21.499313 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:45:21.499323 | orchestrator | 2025-09-27 00:45:21.499334 | orchestrator | TASK [openvswitch : Create /run/openvswitch directory on host] ***************** 2025-09-27 00:45:21.499345 | orchestrator | Saturday 27 September 2025 00:44:27 +0000 (0:00:02.732) 0:00:18.952 **** 2025-09-27 00:45:21.499356 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:45:21.499366 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:45:21.499377 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:45:21.499388 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:45:21.499398 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:45:21.499409 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:45:21.499420 | orchestrator | 2025-09-27 00:45:21.499431 | orchestrator | TASK [openvswitch : Ensuring config directories exist] ************************* 2025-09-27 00:45:21.499442 | orchestrator | Saturday 27 September 2025 00:44:28 +0000 (0:00:00.863) 0:00:19.816 **** 2025-09-27 00:45:21.499462 | orchestrator | changed: [testbed-node-0] => (item={'key': 'openvswitch-db-server', 'value': {'container_name': 'openvswitch_db', 'image': 'registry.osism.tech/kolla/openvswitch-db-server:2024.2', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'volumes': ['/etc/kolla/openvswitch-db-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/', 'openvswitch_db:/var/lib/openvswitch/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovsdb-client list-dbs'], 'timeout': '30'}}}) 2025-09-27 00:45:21.499482 | orchestrator | changed: [testbed-node-2] => (item={'key': 'openvswitch-db-server', 'value': {'container_name': 'openvswitch_db', 'image': 'registry.osism.tech/kolla/openvswitch-db-server:2024.2', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'volumes': ['/etc/kolla/openvswitch-db-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/', 'openvswitch_db:/var/lib/openvswitch/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovsdb-client list-dbs'], 'timeout': '30'}}}) 2025-09-27 00:45:21.499494 | orchestrator | changed: [testbed-node-1] => (item={'key': 'openvswitch-db-server', 'value': {'container_name': 'openvswitch_db', 'image': 'registry.osism.tech/kolla/openvswitch-db-server:2024.2', 
'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'volumes': ['/etc/kolla/openvswitch-db-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/', 'openvswitch_db:/var/lib/openvswitch/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovsdb-client list-dbs'], 'timeout': '30'}}}) 2025-09-27 00:45:21.499522 | orchestrator | changed: [testbed-node-5] => (item={'key': 'openvswitch-db-server', 'value': {'container_name': 'openvswitch_db', 'image': 'registry.osism.tech/kolla/openvswitch-db-server:2024.2', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'volumes': ['/etc/kolla/openvswitch-db-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/', 'openvswitch_db:/var/lib/openvswitch/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovsdb-client list-dbs'], 'timeout': '30'}}}) 2025-09-27 00:45:21.499534 | orchestrator | changed: [testbed-node-4] => (item={'key': 'openvswitch-db-server', 'value': {'container_name': 'openvswitch_db', 'image': 'registry.osism.tech/kolla/openvswitch-db-server:2024.2', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'volumes': ['/etc/kolla/openvswitch-db-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/', 'openvswitch_db:/var/lib/openvswitch/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovsdb-client list-dbs'], 'timeout': '30'}}}) 2025-09-27 00:45:21.499546 | orchestrator | changed: [testbed-node-3] => (item={'key': 'openvswitch-db-server', 'value': {'container_name': 'openvswitch_db', 'image': 'registry.osism.tech/kolla/openvswitch-db-server:2024.2', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'volumes': ['/etc/kolla/openvswitch-db-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/', 'openvswitch_db:/var/lib/openvswitch/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovsdb-client list-dbs'], 'timeout': '30'}}}) 2025-09-27 00:45:21.499563 | orchestrator | changed: [testbed-node-0] => (item={'key': 'openvswitch-vswitchd', 'value': {'container_name': 'openvswitch_vswitchd', 'image': 'registry.osism.tech/kolla/openvswitch-vswitchd:2024.2', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'privileged': True, 'volumes': ['/etc/kolla/openvswitch-vswitchd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovs-appctl version'], 'timeout': '30'}}}) 2025-09-27 00:45:21.499575 | orchestrator | changed: [testbed-node-2] => 
(item={'key': 'openvswitch-vswitchd', 'value': {'container_name': 'openvswitch_vswitchd', 'image': 'registry.osism.tech/kolla/openvswitch-vswitchd:2024.2', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'privileged': True, 'volumes': ['/etc/kolla/openvswitch-vswitchd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovs-appctl version'], 'timeout': '30'}}}) 2025-09-27 00:45:21.499593 | orchestrator | changed: [testbed-node-1] => (item={'key': 'openvswitch-vswitchd', 'value': {'container_name': 'openvswitch_vswitchd', 'image': 'registry.osism.tech/kolla/openvswitch-vswitchd:2024.2', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'privileged': True, 'volumes': ['/etc/kolla/openvswitch-vswitchd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovs-appctl version'], 'timeout': '30'}}}) 2025-09-27 00:45:21.499623 | orchestrator | changed: [testbed-node-5] => (item={'key': 'openvswitch-vswitchd', 'value': {'container_name': 'openvswitch_vswitchd', 'image': 'registry.osism.tech/kolla/openvswitch-vswitchd:2024.2', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'privileged': True, 'volumes': ['/etc/kolla/openvswitch-vswitchd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovs-appctl version'], 'timeout': '30'}}}) 2025-09-27 00:45:21.499635 | orchestrator | changed: [testbed-node-3] => (item={'key': 'openvswitch-vswitchd', 'value': {'container_name': 'openvswitch_vswitchd', 'image': 'registry.osism.tech/kolla/openvswitch-vswitchd:2024.2', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'privileged': True, 'volumes': ['/etc/kolla/openvswitch-vswitchd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovs-appctl version'], 'timeout': '30'}}}) 2025-09-27 00:45:21.499646 | orchestrator | changed: [testbed-node-4] => (item={'key': 'openvswitch-vswitchd', 'value': {'container_name': 'openvswitch_vswitchd', 'image': 'registry.osism.tech/kolla/openvswitch-vswitchd:2024.2', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'privileged': True, 'volumes': ['/etc/kolla/openvswitch-vswitchd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovs-appctl version'], 'timeout': '30'}}}) 
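Note: the openvswitch-vswitchd definition logged above translates, roughly, into a docker invocation like the sketch below. This is illustrative only; kolla-ansible actually starts the container through its own kolla_docker module and the /var/lib/kolla/config_files config.json mechanism, so the exact flags and restart policy may differ from this sketch.

    docker run -d --name openvswitch_vswitchd \
      --privileged \
      -v /etc/kolla/openvswitch-vswitchd/:/var/lib/kolla/config_files/:ro \
      -v /etc/localtime:/etc/localtime:ro \
      -v /etc/timezone:/etc/timezone:ro \
      -v /lib/modules:/lib/modules:ro \
      -v /run/openvswitch:/run/openvswitch:shared \
      -v kolla_logs:/var/log/kolla/ \
      --health-cmd "ovs-appctl version" \
      --health-interval 30s --health-retries 3 \
      --health-start-period 5s --health-timeout 30s \
      registry.osism.tech/kolla/openvswitch-vswitchd:2024.2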
2025-09-27 00:45:21.499658 | orchestrator | 2025-09-27 00:45:21.499669 | orchestrator | TASK [openvswitch : Copying over config.json files for services] *************** 2025-09-27 00:45:21.499679 | orchestrator | Saturday 27 September 2025 00:44:30 +0000 (0:00:01.968) 0:00:21.784 **** 2025-09-27 00:45:21.499693 | orchestrator | changed: [testbed-node-0] => (item={'key': 'openvswitch-db-server', 'value': {'container_name': 'openvswitch_db', 'image': 'registry.osism.tech/kolla/openvswitch-db-server:2024.2', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'volumes': ['/etc/kolla/openvswitch-db-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/', 'openvswitch_db:/var/lib/openvswitch/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovsdb-client list-dbs'], 'timeout': '30'}}}) 2025-09-27 00:45:21.499704 | orchestrator | changed: [testbed-node-1] => (item={'key': 'openvswitch-db-server', 'value': {'container_name': 'openvswitch_db', 'image': 'registry.osism.tech/kolla/openvswitch-db-server:2024.2', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'volumes': ['/etc/kolla/openvswitch-db-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/', 'openvswitch_db:/var/lib/openvswitch/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovsdb-client list-dbs'], 'timeout': '30'}}}) 2025-09-27 00:45:21.499720 | orchestrator | changed: [testbed-node-2] => (item={'key': 'openvswitch-db-server', 'value': {'container_name': 'openvswitch_db', 'image': 'registry.osism.tech/kolla/openvswitch-db-server:2024.2', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'volumes': ['/etc/kolla/openvswitch-db-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/', 'openvswitch_db:/var/lib/openvswitch/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovsdb-client list-dbs'], 'timeout': '30'}}}) 2025-09-27 00:45:21.499738 | orchestrator | changed: [testbed-node-3] => (item={'key': 'openvswitch-db-server', 'value': {'container_name': 'openvswitch_db', 'image': 'registry.osism.tech/kolla/openvswitch-db-server:2024.2', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'volumes': ['/etc/kolla/openvswitch-db-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/', 'openvswitch_db:/var/lib/openvswitch/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovsdb-client list-dbs'], 'timeout': '30'}}}) 2025-09-27 00:45:21.499748 | orchestrator | changed: [testbed-node-4] => (item={'key': 'openvswitch-db-server', 'value': {'container_name': 'openvswitch_db', 'image': 'registry.osism.tech/kolla/openvswitch-db-server:2024.2', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': 
True, 'volumes': ['/etc/kolla/openvswitch-db-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/', 'openvswitch_db:/var/lib/openvswitch/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovsdb-client list-dbs'], 'timeout': '30'}}}) 2025-09-27 00:45:21.499763 | orchestrator | changed: [testbed-node-0] => (item={'key': 'openvswitch-vswitchd', 'value': {'container_name': 'openvswitch_vswitchd', 'image': 'registry.osism.tech/kolla/openvswitch-vswitchd:2024.2', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'privileged': True, 'volumes': ['/etc/kolla/openvswitch-vswitchd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovs-appctl version'], 'timeout': '30'}}}) 2025-09-27 00:45:21.499774 | orchestrator | changed: [testbed-node-5] => (item={'key': 'openvswitch-db-server', 'value': {'container_name': 'openvswitch_db', 'image': 'registry.osism.tech/kolla/openvswitch-db-server:2024.2', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'volumes': ['/etc/kolla/openvswitch-db-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/', 'openvswitch_db:/var/lib/openvswitch/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovsdb-client list-dbs'], 'timeout': '30'}}}) 2025-09-27 00:45:21.499790 | orchestrator | changed: [testbed-node-1] => (item={'key': 'openvswitch-vswitchd', 'value': {'container_name': 'openvswitch_vswitchd', 'image': 'registry.osism.tech/kolla/openvswitch-vswitchd:2024.2', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'privileged': True, 'volumes': ['/etc/kolla/openvswitch-vswitchd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovs-appctl version'], 'timeout': '30'}}}) 2025-09-27 00:45:21.499800 | orchestrator | changed: [testbed-node-3] => (item={'key': 'openvswitch-vswitchd', 'value': {'container_name': 'openvswitch_vswitchd', 'image': 'registry.osism.tech/kolla/openvswitch-vswitchd:2024.2', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'privileged': True, 'volumes': ['/etc/kolla/openvswitch-vswitchd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovs-appctl version'], 'timeout': '30'}}}) 2025-09-27 00:45:21.499816 | orchestrator | changed: [testbed-node-2] => (item={'key': 'openvswitch-vswitchd', 'value': {'container_name': 'openvswitch_vswitchd', 'image': 
'registry.osism.tech/kolla/openvswitch-vswitchd:2024.2', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'privileged': True, 'volumes': ['/etc/kolla/openvswitch-vswitchd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovs-appctl version'], 'timeout': '30'}}}) 2025-09-27 00:45:21.499827 | orchestrator | changed: [testbed-node-4] => (item={'key': 'openvswitch-vswitchd', 'value': {'container_name': 'openvswitch_vswitchd', 'image': 'registry.osism.tech/kolla/openvswitch-vswitchd:2024.2', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'privileged': True, 'volumes': ['/etc/kolla/openvswitch-vswitchd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovs-appctl version'], 'timeout': '30'}}}) 2025-09-27 00:45:21.499841 | orchestrator | changed: [testbed-node-5] => (item={'key': 'openvswitch-vswitchd', 'value': {'container_name': 'openvswitch_vswitchd', 'image': 'registry.osism.tech/kolla/openvswitch-vswitchd:2024.2', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'privileged': True, 'volumes': ['/etc/kolla/openvswitch-vswitchd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovs-appctl version'], 'timeout': '30'}}}) 2025-09-27 00:45:21.499851 | orchestrator | 2025-09-27 00:45:21.499861 | orchestrator | TASK [openvswitch : Copying over ovs-vsctl wrapper] **************************** 2025-09-27 00:45:21.499871 | orchestrator | Saturday 27 September 2025 00:44:33 +0000 (0:00:03.856) 0:00:25.641 **** 2025-09-27 00:45:21.499881 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:45:21.499891 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:45:21.499911 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:45:21.499921 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:45:21.499930 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:45:21.499940 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:45:21.499949 | orchestrator | 2025-09-27 00:45:21.499959 | orchestrator | TASK [openvswitch : Check openvswitch containers] ****************************** 2025-09-27 00:45:21.499969 | orchestrator | Saturday 27 September 2025 00:44:35 +0000 (0:00:01.488) 0:00:27.129 **** 2025-09-27 00:45:21.499979 | orchestrator | changed: [testbed-node-0] => (item={'key': 'openvswitch-db-server', 'value': {'container_name': 'openvswitch_db', 'image': 'registry.osism.tech/kolla/openvswitch-db-server:2024.2', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'volumes': ['/etc/kolla/openvswitch-db-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/', 
'openvswitch_db:/var/lib/openvswitch/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovsdb-client list-dbs'], 'timeout': '30'}}}) 2025-09-27 00:45:21.499989 | orchestrator | changed: [testbed-node-1] => (item={'key': 'openvswitch-db-server', 'value': {'container_name': 'openvswitch_db', 'image': 'registry.osism.tech/kolla/openvswitch-db-server:2024.2', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'volumes': ['/etc/kolla/openvswitch-db-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/', 'openvswitch_db:/var/lib/openvswitch/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovsdb-client list-dbs'], 'timeout': '30'}}}) 2025-09-27 00:45:21.500005 | orchestrator | changed: [testbed-node-3] => (item={'key': 'openvswitch-db-server', 'value': {'container_name': 'openvswitch_db', 'image': 'registry.osism.tech/kolla/openvswitch-db-server:2024.2', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'volumes': ['/etc/kolla/openvswitch-db-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/', 'openvswitch_db:/var/lib/openvswitch/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovsdb-client list-dbs'], 'timeout': '30'}}}) 2025-09-27 00:45:21.500016 | orchestrator | changed: [testbed-node-4] => (item={'key': 'openvswitch-db-server', 'value': {'container_name': 'openvswitch_db', 'image': 'registry.osism.tech/kolla/openvswitch-db-server:2024.2', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'volumes': ['/etc/kolla/openvswitch-db-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/', 'openvswitch_db:/var/lib/openvswitch/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovsdb-client list-dbs'], 'timeout': '30'}}}) 2025-09-27 00:45:21.500026 | orchestrator | changed: [testbed-node-2] => (item={'key': 'openvswitch-db-server', 'value': {'container_name': 'openvswitch_db', 'image': 'registry.osism.tech/kolla/openvswitch-db-server:2024.2', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'volumes': ['/etc/kolla/openvswitch-db-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/', 'openvswitch_db:/var/lib/openvswitch/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovsdb-client list-dbs'], 'timeout': '30'}}}) 2025-09-27 00:45:21.500048 | orchestrator | changed: [testbed-node-0] => (item={'key': 'openvswitch-vswitchd', 'value': {'container_name': 'openvswitch_vswitchd', 'image': 'registry.osism.tech/kolla/openvswitch-vswitchd:2024.2', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'privileged': True, 'volumes': 
['/etc/kolla/openvswitch-vswitchd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovs-appctl version'], 'timeout': '30'}}}) 2025-09-27 00:45:21.500059 | orchestrator | changed: [testbed-node-5] => (item={'key': 'openvswitch-db-server', 'value': {'container_name': 'openvswitch_db', 'image': 'registry.osism.tech/kolla/openvswitch-db-server:2024.2', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'volumes': ['/etc/kolla/openvswitch-db-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/', 'openvswitch_db:/var/lib/openvswitch/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovsdb-client list-dbs'], 'timeout': '30'}}}) 2025-09-27 00:45:21.500074 | orchestrator | changed: [testbed-node-1] => (item={'key': 'openvswitch-vswitchd', 'value': {'container_name': 'openvswitch_vswitchd', 'image': 'registry.osism.tech/kolla/openvswitch-vswitchd:2024.2', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'privileged': True, 'volumes': ['/etc/kolla/openvswitch-vswitchd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovs-appctl version'], 'timeout': '30'}}}) 2025-09-27 00:45:21.500085 | orchestrator | changed: [testbed-node-3] => (item={'key': 'openvswitch-vswitchd', 'value': {'container_name': 'openvswitch_vswitchd', 'image': 'registry.osism.tech/kolla/openvswitch-vswitchd:2024.2', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'privileged': True, 'volumes': ['/etc/kolla/openvswitch-vswitchd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovs-appctl version'], 'timeout': '30'}}}) 2025-09-27 00:45:21.500095 | orchestrator | changed: [testbed-node-2] => (item={'key': 'openvswitch-vswitchd', 'value': {'container_name': 'openvswitch_vswitchd', 'image': 'registry.osism.tech/kolla/openvswitch-vswitchd:2024.2', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'privileged': True, 'volumes': ['/etc/kolla/openvswitch-vswitchd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovs-appctl version'], 'timeout': '30'}}}) 2025-09-27 00:45:21.500109 | orchestrator | changed: [testbed-node-4] => (item={'key': 'openvswitch-vswitchd', 'value': {'container_name': 'openvswitch_vswitchd', 'image': 'registry.osism.tech/kolla/openvswitch-vswitchd:2024.2', 
'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'privileged': True, 'volumes': ['/etc/kolla/openvswitch-vswitchd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovs-appctl version'], 'timeout': '30'}}}) 2025-09-27 00:45:21.500125 | orchestrator | changed: [testbed-node-5] => (item={'key': 'openvswitch-vswitchd', 'value': {'container_name': 'openvswitch_vswitchd', 'image': 'registry.osism.tech/kolla/openvswitch-vswitchd:2024.2', 'enabled': True, 'group': 'openvswitch', 'host_in_groups': True, 'privileged': True, 'volumes': ['/etc/kolla/openvswitch-vswitchd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'ovs-appctl version'], 'timeout': '30'}}}) 2025-09-27 00:45:21.500135 | orchestrator | 2025-09-27 00:45:21.500145 | orchestrator | TASK [openvswitch : Flush Handlers] ******************************************** 2025-09-27 00:45:21.500155 | orchestrator | Saturday 27 September 2025 00:44:37 +0000 (0:00:02.305) 0:00:29.435 **** 2025-09-27 00:45:21.500165 | orchestrator | 2025-09-27 00:45:21.500174 | orchestrator | TASK [openvswitch : Flush Handlers] ******************************************** 2025-09-27 00:45:21.500184 | orchestrator | Saturday 27 September 2025 00:44:38 +0000 (0:00:00.437) 0:00:29.872 **** 2025-09-27 00:45:21.500193 | orchestrator | 2025-09-27 00:45:21.500203 | orchestrator | TASK [openvswitch : Flush Handlers] ******************************************** 2025-09-27 00:45:21.500230 | orchestrator | Saturday 27 September 2025 00:44:38 +0000 (0:00:00.118) 0:00:29.991 **** 2025-09-27 00:45:21.500239 | orchestrator | 2025-09-27 00:45:21.500249 | orchestrator | TASK [openvswitch : Flush Handlers] ******************************************** 2025-09-27 00:45:21.500259 | orchestrator | Saturday 27 September 2025 00:44:38 +0000 (0:00:00.109) 0:00:30.100 **** 2025-09-27 00:45:21.500268 | orchestrator | 2025-09-27 00:45:21.500278 | orchestrator | TASK [openvswitch : Flush Handlers] ******************************************** 2025-09-27 00:45:21.500288 | orchestrator | Saturday 27 September 2025 00:44:38 +0000 (0:00:00.103) 0:00:30.203 **** 2025-09-27 00:45:21.500297 | orchestrator | 2025-09-27 00:45:21.500307 | orchestrator | TASK [openvswitch : Flush Handlers] ******************************************** 2025-09-27 00:45:21.500317 | orchestrator | Saturday 27 September 2025 00:44:38 +0000 (0:00:00.101) 0:00:30.305 **** 2025-09-27 00:45:21.500326 | orchestrator | 2025-09-27 00:45:21.500336 | orchestrator | RUNNING HANDLER [openvswitch : Restart openvswitch-db-server container] ******** 2025-09-27 00:45:21.500346 | orchestrator | Saturday 27 September 2025 00:44:38 +0000 (0:00:00.101) 0:00:30.406 **** 2025-09-27 00:45:21.500355 | orchestrator | changed: [testbed-node-1] 2025-09-27 00:45:21.500365 | orchestrator | changed: [testbed-node-2] 2025-09-27 00:45:21.500375 | orchestrator | changed: [testbed-node-0] 2025-09-27 00:45:21.500384 | orchestrator | changed: [testbed-node-4] 2025-09-27 00:45:21.500400 | orchestrator | 
changed: [testbed-node-5] 2025-09-27 00:45:21.500410 | orchestrator | changed: [testbed-node-3] 2025-09-27 00:45:21.500420 | orchestrator | 2025-09-27 00:45:21.500429 | orchestrator | RUNNING HANDLER [openvswitch : Waiting for openvswitch_db service to be ready] *** 2025-09-27 00:45:21.500439 | orchestrator | Saturday 27 September 2025 00:45:03 +0000 (0:00:25.115) 0:00:55.522 **** 2025-09-27 00:45:21.500449 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:45:21.500459 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:45:21.500468 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:45:21.500478 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:45:21.500487 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:45:21.500497 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:45:21.500506 | orchestrator | 2025-09-27 00:45:21.500516 | orchestrator | RUNNING HANDLER [openvswitch : Restart openvswitch-vswitchd container] ********* 2025-09-27 00:45:21.500526 | orchestrator | Saturday 27 September 2025 00:45:05 +0000 (0:00:01.884) 0:00:57.407 **** 2025-09-27 00:45:21.500541 | orchestrator | changed: [testbed-node-1] 2025-09-27 00:45:21.500551 | orchestrator | changed: [testbed-node-3] 2025-09-27 00:45:21.500560 | orchestrator | changed: [testbed-node-0] 2025-09-27 00:45:21.500570 | orchestrator | changed: [testbed-node-2] 2025-09-27 00:45:21.500579 | orchestrator | changed: [testbed-node-4] 2025-09-27 00:45:21.500589 | orchestrator | changed: [testbed-node-5] 2025-09-27 00:45:21.500598 | orchestrator | 2025-09-27 00:45:21.500608 | orchestrator | TASK [openvswitch : Set system-id, hostname and hw-offload] ******************** 2025-09-27 00:45:21.500618 | orchestrator | Saturday 27 September 2025 00:45:16 +0000 (0:00:10.359) 0:01:07.767 **** 2025-09-27 00:45:21.500629 | orchestrator | failed: [testbed-node-0] (item={'col': 'external_ids', 'name': 'system-id', 'value': 'testbed-node-0'}) => {"ansible_loop_var": "item", "changed": false, "item": {"col": "external_ids", "name": "system-id", "value": "testbed-node-0"}, "msg": "kolla_toolbox container is not running."} 2025-09-27 00:45:21.500645 | orchestrator | failed: [testbed-node-1] (item={'col': 'external_ids', 'name': 'system-id', 'value': 'testbed-node-1'}) => {"ansible_loop_var": "item", "changed": false, "item": {"col": "external_ids", "name": "system-id", "value": "testbed-node-1"}, "msg": "kolla_toolbox container is not running."} 2025-09-27 00:45:21.500655 | orchestrator | failed: [testbed-node-2] (item={'col': 'external_ids', 'name': 'system-id', 'value': 'testbed-node-2'}) => {"ansible_loop_var": "item", "changed": false, "item": {"col": "external_ids", "name": "system-id", "value": "testbed-node-2"}, "msg": "kolla_toolbox container is not running."} 2025-09-27 00:45:21.500669 | orchestrator | failed: [testbed-node-3] (item={'col': 'external_ids', 'name': 'system-id', 'value': 'testbed-node-3'}) => {"ansible_loop_var": "item", "changed": false, "item": {"col": "external_ids", "name": "system-id", "value": "testbed-node-3"}, "msg": "kolla_toolbox container is not running."} 2025-09-27 00:45:21.500679 | orchestrator | failed: [testbed-node-4] (item={'col': 'external_ids', 'name': 'system-id', 'value': 'testbed-node-4'}) => {"ansible_loop_var": "item", "changed": false, "item": {"col": "external_ids", "name": "system-id", "value": "testbed-node-4"}, "msg": "kolla_toolbox container is not running."} 2025-09-27 00:45:21.500689 | orchestrator | failed: [testbed-node-5] (item={'col': 'external_ids', 'name': 'system-id', 'value': 
'testbed-node-5'}) => {"ansible_loop_var": "item", "changed": false, "item": {"col": "external_ids", "name": "system-id", "value": "testbed-node-5"}, "msg": "kolla_toolbox container is not running."} 2025-09-27 00:45:21.500699 | orchestrator | failed: [testbed-node-1] (item={'col': 'external_ids', 'name': 'hostname', 'value': 'testbed-node-1'}) => {"ansible_loop_var": "item", "changed": false, "item": {"col": "external_ids", "name": "hostname", "value": "testbed-node-1"}, "msg": "kolla_toolbox container is not running."} 2025-09-27 00:45:21.500709 | orchestrator | failed: [testbed-node-0] (item={'col': 'external_ids', 'name': 'hostname', 'value': 'testbed-node-0'}) => {"ansible_loop_var": "item", "changed": false, "item": {"col": "external_ids", "name": "hostname", "value": "testbed-node-0"}, "msg": "kolla_toolbox container is not running."} 2025-09-27 00:45:21.500719 | orchestrator | failed: [testbed-node-3] (item={'col': 'external_ids', 'name': 'hostname', 'value': 'testbed-node-3'}) => {"ansible_loop_var": "item", "changed": false, "item": {"col": "external_ids", "name": "hostname", "value": "testbed-node-3"}, "msg": "kolla_toolbox container is not running."} 2025-09-27 00:45:21.500728 | orchestrator | failed: [testbed-node-2] (item={'col': 'external_ids', 'name': 'hostname', 'value': 'testbed-node-2'}) => {"ansible_loop_var": "item", "changed": false, "item": {"col": "external_ids", "name": "hostname", "value": "testbed-node-2"}, "msg": "kolla_toolbox container is not running."} 2025-09-27 00:45:21.500743 | orchestrator | failed: [testbed-node-4] (item={'col': 'external_ids', 'name': 'hostname', 'value': 'testbed-node-4'}) => {"ansible_loop_var": "item", "changed": false, "item": {"col": "external_ids", "name": "hostname", "value": "testbed-node-4"}, "msg": "kolla_toolbox container is not running."} 2025-09-27 00:45:21.500760 | orchestrator | failed: [testbed-node-5] (item={'col': 'external_ids', 'name': 'hostname', 'value': 'testbed-node-5'}) => {"ansible_loop_var": "item", "changed": false, "item": {"col": "external_ids", "name": "hostname", "value": "testbed-node-5"}, "msg": "kolla_toolbox container is not running."} 2025-09-27 00:45:21.500770 | orchestrator | failed: [testbed-node-1] (item={'col': 'other_config', 'name': 'hw-offload', 'value': True, 'state': 'absent'}) => {"ansible_loop_var": "item", "changed": false, "item": {"col": "other_config", "name": "hw-offload", "state": "absent", "value": true}, "msg": "kolla_toolbox container is not running."} 2025-09-27 00:45:21.500781 | orchestrator | failed: [testbed-node-0] (item={'col': 'other_config', 'name': 'hw-offload', 'value': True, 'state': 'absent'}) => {"ansible_loop_var": "item", "changed": false, "item": {"col": "other_config", "name": "hw-offload", "state": "absent", "value": true}, "msg": "kolla_toolbox container is not running."} 2025-09-27 00:45:21.500791 | orchestrator | failed: [testbed-node-3] (item={'col': 'other_config', 'name': 'hw-offload', 'value': True, 'state': 'absent'}) => {"ansible_loop_var": "item", "changed": false, "item": {"col": "other_config", "name": "hw-offload", "state": "absent", "value": true}, "msg": "kolla_toolbox container is not running."} 2025-09-27 00:45:21.500801 | orchestrator | failed: [testbed-node-2] (item={'col': 'other_config', 'name': 'hw-offload', 'value': True, 'state': 'absent'}) => {"ansible_loop_var": "item", "changed": false, "item": {"col": "other_config", "name": "hw-offload", "state": "absent", "value": true}, "msg": "kolla_toolbox container is not running."} 2025-09-27 
00:45:21.500815 | orchestrator | failed: [testbed-node-4] (item={'col': 'other_config', 'name': 'hw-offload', 'value': True, 'state': 'absent'}) => {"ansible_loop_var": "item", "changed": false, "item": {"col": "other_config", "name": "hw-offload", "state": "absent", "value": true}, "msg": "kolla_toolbox container is not running."} 2025-09-27 00:45:21.500824 | orchestrator | failed: [testbed-node-5] (item={'col': 'other_config', 'name': 'hw-offload', 'value': True, 'state': 'absent'}) => {"ansible_loop_var": "item", "changed": false, "item": {"col": "other_config", "name": "hw-offload", "state": "absent", "value": true}, "msg": "kolla_toolbox container is not running."} 2025-09-27 00:45:21.500834 | orchestrator | 2025-09-27 00:45:21.500844 | orchestrator | PLAY RECAP ********************************************************************* 2025-09-27 00:45:21.500854 | orchestrator | testbed-node-0 : ok=11  changed=7  unreachable=0 failed=1  skipped=3  rescued=0 ignored=0 2025-09-27 00:45:21.500865 | orchestrator | testbed-node-1 : ok=11  changed=7  unreachable=0 failed=1  skipped=3  rescued=0 ignored=0 2025-09-27 00:45:21.500874 | orchestrator | testbed-node-2 : ok=11  changed=7  unreachable=0 failed=1  skipped=3  rescued=0 ignored=0 2025-09-27 00:45:21.500884 | orchestrator | testbed-node-3 : ok=11  changed=7  unreachable=0 failed=1  skipped=3  rescued=0 ignored=0 2025-09-27 00:45:21.500894 | orchestrator | testbed-node-4 : ok=11  changed=7  unreachable=0 failed=1  skipped=3  rescued=0 ignored=0 2025-09-27 00:45:21.500903 | orchestrator | testbed-node-5 : ok=11  changed=7  unreachable=0 failed=1  skipped=3  rescued=0 ignored=0 2025-09-27 00:45:21.500913 | orchestrator | 2025-09-27 00:45:21.500922 | orchestrator | 2025-09-27 00:45:21.500932 | orchestrator | TASKS RECAP ******************************************************************** 2025-09-27 00:45:21.500942 | orchestrator | Saturday 27 September 2025 00:45:18 +0000 (0:00:02.479) 0:01:10.246 **** 2025-09-27 00:45:21.500956 | orchestrator | =============================================================================== 2025-09-27 00:45:21.500966 | orchestrator | openvswitch : Restart openvswitch-db-server container ------------------ 25.12s 2025-09-27 00:45:21.500975 | orchestrator | openvswitch : Restart openvswitch-vswitchd container ------------------- 10.36s 2025-09-27 00:45:21.500985 | orchestrator | openvswitch : Copying over config.json files for services --------------- 3.86s 2025-09-27 00:45:21.500994 | orchestrator | module-load : Load modules ---------------------------------------------- 3.65s 2025-09-27 00:45:21.501004 | orchestrator | openvswitch : include_tasks --------------------------------------------- 3.20s 2025-09-27 00:45:21.501019 | orchestrator | module-load : Persist modules via modules-load.d ------------------------ 2.90s 2025-09-27 00:45:21.501029 | orchestrator | module-load : Drop module persistence ----------------------------------- 2.73s 2025-09-27 00:45:21.501038 | orchestrator | openvswitch : Set system-id, hostname and hw-offload -------------------- 2.48s 2025-09-27 00:45:21.501048 | orchestrator | Group hosts based on Kolla action --------------------------------------- 2.34s 2025-09-27 00:45:21.501058 | orchestrator | Group hosts based on enabled services ----------------------------------- 2.31s 2025-09-27 00:45:21.501068 | orchestrator | openvswitch : Check openvswitch containers ------------------------------ 2.31s 2025-09-27 00:45:21.501077 | orchestrator | openvswitch : Ensuring config 
directories exist ------------------------- 1.97s 2025-09-27 00:45:21.501087 | orchestrator | openvswitch : Waiting for openvswitch_db service to be ready ------------ 1.88s 2025-09-27 00:45:21.501097 | orchestrator | openvswitch : Copying over ovs-vsctl wrapper ---------------------------- 1.49s 2025-09-27 00:45:21.501106 | orchestrator | openvswitch : Flush Handlers -------------------------------------------- 0.97s 2025-09-27 00:45:21.501116 | orchestrator | openvswitch : Create /run/openvswitch directory on host ----------------- 0.86s 2025-09-27 00:45:21.501125 | orchestrator | 2025-09-27 00:45:21 | INFO  | Task ce7cb7a8-3653-4424-bfed-7e1a3f9348b1 is in state SUCCESS 2025-09-27 00:45:21.501135 | orchestrator | 2025-09-27 00:45:21 | INFO  | Task 9bb7b69e-840d-4ad5-b3ec-d30a1abcbb15 is in state STARTED 2025-09-27 00:45:21.501145 | orchestrator | 2025-09-27 00:45:21 | INFO  | Task 625d002c-1b01-4e2f-8d5c-d5ab32c60c73 is in state STARTED 2025-09-27 00:45:21.501155 | orchestrator | 2025-09-27 00:45:21 | INFO  | Task 4bac4415-47ae-418f-938a-24c52f1a62d3 is in state STARTED 2025-09-27 00:45:21.501164 | orchestrator | 2025-09-27 00:45:21 | INFO  | Task 1d877c1d-aea7-49a7-ac1d-fd8e8254af58 is in state STARTED 2025-09-27 00:45:21.501174 | orchestrator | 2025-09-27 00:45:21 | INFO  | Task 0bfbbc49-122f-42e2-b6bc-552d5dd0b6e2 is in state STARTED 2025-09-27 00:45:21.501184 | orchestrator | 2025-09-27 00:45:21 | INFO  | Wait 1 second(s) until the next check 2025-09-27 00:45:24.530766 | orchestrator | 2025-09-27 00:45:24 | INFO  | Task 9bb7b69e-840d-4ad5-b3ec-d30a1abcbb15 is in state STARTED 2025-09-27 00:45:24.532358 | orchestrator | 2025-09-27 00:45:24 | INFO  | Task 625d002c-1b01-4e2f-8d5c-d5ab32c60c73 is in state STARTED 2025-09-27 00:45:24.534497 | orchestrator | 2025-09-27 00:45:24 | INFO  | Task 4bac4415-47ae-418f-938a-24c52f1a62d3 is in state STARTED 2025-09-27 00:45:24.535654 | orchestrator | 2025-09-27 00:45:24 | INFO  | Task 1d877c1d-aea7-49a7-ac1d-fd8e8254af58 is in state STARTED 2025-09-27 00:45:24.537978 | orchestrator | 2025-09-27 00:45:24 | INFO  | Task 0bfbbc49-122f-42e2-b6bc-552d5dd0b6e2 is in state STARTED 2025-09-27 00:45:24.538332 | orchestrator | 2025-09-27 00:45:24 | INFO  | Wait 1 second(s) until the next check 2025-09-27 00:45:27.560468 | orchestrator | 2025-09-27 00:45:27 | INFO  | Task 9bb7b69e-840d-4ad5-b3ec-d30a1abcbb15 is in state STARTED 2025-09-27 00:45:27.561275 | orchestrator | 2025-09-27 00:45:27 | INFO  | Task 625d002c-1b01-4e2f-8d5c-d5ab32c60c73 is in state STARTED 2025-09-27 00:45:27.561899 | orchestrator | 2025-09-27 00:45:27 | INFO  | Task 4bac4415-47ae-418f-938a-24c52f1a62d3 is in state STARTED 2025-09-27 00:45:27.562652 | orchestrator | 2025-09-27 00:45:27 | INFO  | Task 1d877c1d-aea7-49a7-ac1d-fd8e8254af58 is in state STARTED 2025-09-27 00:45:27.564809 | orchestrator | 2025-09-27 00:45:27 | INFO  | Task 0bfbbc49-122f-42e2-b6bc-552d5dd0b6e2 is in state STARTED 2025-09-27 00:45:27.564859 | orchestrator | 2025-09-27 00:45:27 | INFO  | Wait 1 second(s) until the next check 2025-09-27 00:45:30.585751 | orchestrator | 2025-09-27 00:45:30 | INFO  | Task 9bb7b69e-840d-4ad5-b3ec-d30a1abcbb15 is in state STARTED 2025-09-27 00:45:30.592001 | orchestrator | 2025-09-27 00:45:30 | INFO  | Task 625d002c-1b01-4e2f-8d5c-d5ab32c60c73 is in state STARTED 2025-09-27 00:45:30.592903 | orchestrator | 2025-09-27 00:45:30 | INFO  | Task 4bac4415-47ae-418f-938a-24c52f1a62d3 is in state STARTED 2025-09-27 00:45:30.594110 | orchestrator | 2025-09-27 00:45:30 | INFO  | Task 
1d877c1d-aea7-49a7-ac1d-fd8e8254af58 is in state STARTED 2025-09-27 00:45:30.595152 | orchestrator | 2025-09-27 00:45:30 | INFO  | Task 0bfbbc49-122f-42e2-b6bc-552d5dd0b6e2 is in state STARTED 2025-09-27 00:45:30.595178 | orchestrator | 2025-09-27 00:45:30 | INFO  | Wait 1 second(s) until the next check 2025-09-27 00:45:33.635455 | orchestrator | 2025-09-27 00:45:33.635578 | orchestrator | 2025-09-27 00:45:33.635593 | orchestrator | PLAY [Group hosts based on configuration] ************************************** 2025-09-27 00:45:33.635606 | orchestrator | 2025-09-27 00:45:33.635618 | orchestrator | TASK [Group hosts based on Kolla action] *************************************** 2025-09-27 00:45:33.635630 | orchestrator | Saturday 27 September 2025 00:45:22 +0000 (0:00:00.167) 0:00:00.167 **** 2025-09-27 00:45:33.635641 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:45:33.635654 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:45:33.635665 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:45:33.635676 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:45:33.635687 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:45:33.635698 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:45:33.635709 | orchestrator | 2025-09-27 00:45:33.635720 | orchestrator | TASK [Group hosts based on enabled services] *********************************** 2025-09-27 00:45:33.635732 | orchestrator | Saturday 27 September 2025 00:45:23 +0000 (0:00:00.640) 0:00:00.807 **** 2025-09-27 00:45:33.635743 | orchestrator | ok: [testbed-node-0] => (item=enable_ovn_True) 2025-09-27 00:45:33.635755 | orchestrator | ok: [testbed-node-1] => (item=enable_ovn_True) 2025-09-27 00:45:33.635868 | orchestrator | ok: [testbed-node-2] => (item=enable_ovn_True) 2025-09-27 00:45:33.635881 | orchestrator | ok: [testbed-node-3] => (item=enable_ovn_True) 2025-09-27 00:45:33.635892 | orchestrator | ok: [testbed-node-4] => (item=enable_ovn_True) 2025-09-27 00:45:33.635903 | orchestrator | ok: [testbed-node-5] => (item=enable_ovn_True) 2025-09-27 00:45:33.635914 | orchestrator | 2025-09-27 00:45:33.635925 | orchestrator | PLAY [Apply role ovn-controller] *********************************************** 2025-09-27 00:45:33.635936 | orchestrator | 2025-09-27 00:45:33.635946 | orchestrator | TASK [ovn-controller : include_tasks] ****************************************** 2025-09-27 00:45:33.635957 | orchestrator | Saturday 27 September 2025 00:45:23 +0000 (0:00:00.857) 0:00:01.665 **** 2025-09-27 00:45:33.635970 | orchestrator | included: /ansible/roles/ovn-controller/tasks/deploy.yml for testbed-node-0, testbed-node-1, testbed-node-2, testbed-node-3, testbed-node-4, testbed-node-5 2025-09-27 00:45:33.635983 | orchestrator | 2025-09-27 00:45:33.635994 | orchestrator | TASK [ovn-controller : Ensuring config directories exist] ********************** 2025-09-27 00:45:33.636005 | orchestrator | Saturday 27 September 2025 00:45:24 +0000 (0:00:00.958) 0:00:02.623 **** 2025-09-27 00:45:33.636018 | orchestrator | changed: [testbed-node-1] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 'image': 'registry.osism.tech/kolla/ovn-controller:2024.2', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-09-27 00:45:33.636082 | orchestrator | changed: [testbed-node-0] => (item={'key': 'ovn-controller', 'value': 
{'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 'image': 'registry.osism.tech/kolla/ovn-controller:2024.2', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-09-27 00:45:33.636095 | orchestrator | changed: [testbed-node-2] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 'image': 'registry.osism.tech/kolla/ovn-controller:2024.2', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-09-27 00:45:33.636107 | orchestrator | changed: [testbed-node-3] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 'image': 'registry.osism.tech/kolla/ovn-controller:2024.2', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-09-27 00:45:33.636118 | orchestrator | changed: [testbed-node-4] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 'image': 'registry.osism.tech/kolla/ovn-controller:2024.2', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-09-27 00:45:33.636129 | orchestrator | changed: [testbed-node-5] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 'image': 'registry.osism.tech/kolla/ovn-controller:2024.2', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-09-27 00:45:33.636140 | orchestrator | 2025-09-27 00:45:33.636172 | orchestrator | TASK [ovn-controller : Copying over config.json files for services] ************ 2025-09-27 00:45:33.636184 | orchestrator | Saturday 27 September 2025 00:45:25 +0000 (0:00:01.002) 0:00:03.625 **** 2025-09-27 00:45:33.636195 | orchestrator | changed: [testbed-node-2] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 'image': 'registry.osism.tech/kolla/ovn-controller:2024.2', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-09-27 00:45:33.636231 | orchestrator | changed: [testbed-node-3] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 'image': 'registry.osism.tech/kolla/ovn-controller:2024.2', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-09-27 00:45:33.636243 | orchestrator | changed: [testbed-node-0] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 
'ovn-controller', 'enabled': True, 'image': 'registry.osism.tech/kolla/ovn-controller:2024.2', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-09-27 00:45:33.636263 | orchestrator | changed: [testbed-node-1] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 'image': 'registry.osism.tech/kolla/ovn-controller:2024.2', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-09-27 00:45:33.636280 | orchestrator | changed: [testbed-node-4] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 'image': 'registry.osism.tech/kolla/ovn-controller:2024.2', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-09-27 00:45:33.636291 | orchestrator | changed: [testbed-node-5] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 'image': 'registry.osism.tech/kolla/ovn-controller:2024.2', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-09-27 00:45:33.636302 | orchestrator | 2025-09-27 00:45:33.636313 | orchestrator | TASK [ovn-controller : Ensuring systemd override directory exists] ************* 2025-09-27 00:45:33.636324 | orchestrator | Saturday 27 September 2025 00:45:27 +0000 (0:00:01.569) 0:00:05.194 **** 2025-09-27 00:45:33.636335 | orchestrator | changed: [testbed-node-0] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 'image': 'registry.osism.tech/kolla/ovn-controller:2024.2', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-09-27 00:45:33.636347 | orchestrator | changed: [testbed-node-1] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 'image': 'registry.osism.tech/kolla/ovn-controller:2024.2', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-09-27 00:45:33.636365 | orchestrator | changed: [testbed-node-2] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 'image': 'registry.osism.tech/kolla/ovn-controller:2024.2', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-09-27 00:45:33.636376 | orchestrator | changed: [testbed-node-3] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 'image': 
'registry.osism.tech/kolla/ovn-controller:2024.2', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-09-27 00:45:33.636387 | orchestrator | changed: [testbed-node-4] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 'image': 'registry.osism.tech/kolla/ovn-controller:2024.2', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-09-27 00:45:33.636399 | orchestrator | changed: [testbed-node-5] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 'image': 'registry.osism.tech/kolla/ovn-controller:2024.2', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-09-27 00:45:33.636417 | orchestrator | 2025-09-27 00:45:33.636428 | orchestrator | TASK [ovn-controller : Copying over systemd override] ************************** 2025-09-27 00:45:33.636439 | orchestrator | Saturday 27 September 2025 00:45:28 +0000 (0:00:01.201) 0:00:06.396 **** 2025-09-27 00:45:33.636450 | orchestrator | changed: [testbed-node-0] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 'image': 'registry.osism.tech/kolla/ovn-controller:2024.2', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-09-27 00:45:33.636466 | orchestrator | changed: [testbed-node-1] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 'image': 'registry.osism.tech/kolla/ovn-controller:2024.2', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-09-27 00:45:33.636478 | orchestrator | changed: [testbed-node-3] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 'image': 'registry.osism.tech/kolla/ovn-controller:2024.2', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-09-27 00:45:33.636490 | orchestrator | changed: [testbed-node-2] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 'image': 'registry.osism.tech/kolla/ovn-controller:2024.2', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-09-27 00:45:33.636502 | orchestrator | changed: [testbed-node-4] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 'image': 
'registry.osism.tech/kolla/ovn-controller:2024.2', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-09-27 00:45:33.636515 | orchestrator | changed: [testbed-node-5] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 'image': 'registry.osism.tech/kolla/ovn-controller:2024.2', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-09-27 00:45:33.636527 | orchestrator | 2025-09-27 00:45:33.636545 | orchestrator | TASK [ovn-controller : Check ovn-controller containers] ************************ 2025-09-27 00:45:33.636558 | orchestrator | Saturday 27 September 2025 00:45:30 +0000 (0:00:01.606) 0:00:08.002 **** 2025-09-27 00:45:33.636571 | orchestrator | changed: [testbed-node-1] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 'image': 'registry.osism.tech/kolla/ovn-controller:2024.2', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-09-27 00:45:33.636585 | orchestrator | changed: [testbed-node-0] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 'image': 'registry.osism.tech/kolla/ovn-controller:2024.2', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-09-27 00:45:33.636607 | orchestrator | changed: [testbed-node-2] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 'image': 'registry.osism.tech/kolla/ovn-controller:2024.2', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-09-27 00:45:33.636620 | orchestrator | changed: [testbed-node-3] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 'image': 'registry.osism.tech/kolla/ovn-controller:2024.2', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-09-27 00:45:33.636638 | orchestrator | changed: [testbed-node-4] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 'image': 'registry.osism.tech/kolla/ovn-controller:2024.2', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-09-27 00:45:33.636651 | orchestrator | changed: [testbed-node-5] => (item={'key': 'ovn-controller', 'value': {'container_name': 'ovn_controller', 'group': 'ovn-controller', 'enabled': True, 'image': 
'registry.osism.tech/kolla/ovn-controller:2024.2', 'volumes': ['/etc/kolla/ovn-controller/:/var/lib/kolla/config_files/:ro', '/run/openvswitch:/run/openvswitch:shared', '/etc/localtime:/etc/localtime:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})
2025-09-27 00:45:33.636664 | orchestrator |
2025-09-27 00:45:33.636677 | orchestrator | TASK [ovn-controller : Create br-int bridge on OpenvSwitch] ********************
2025-09-27 00:45:33.636689 | orchestrator | Saturday 27 September 2025 00:45:31 +0000 (0:00:01.315) 0:00:09.318 ****
2025-09-27 00:45:33.636703 | orchestrator | fatal: [testbed-node-1]: FAILED! => {"changed": false, "msg": "kolla_toolbox container is not running."}
2025-09-27 00:45:33.636716 | orchestrator | fatal: [testbed-node-0]: FAILED! => {"changed": false, "msg": "kolla_toolbox container is not running."}
2025-09-27 00:45:33.636728 | orchestrator | fatal: [testbed-node-2]: FAILED! => {"changed": false, "msg": "kolla_toolbox container is not running."}
2025-09-27 00:45:33.636740 | orchestrator | fatal: [testbed-node-3]: FAILED! => {"changed": false, "msg": "kolla_toolbox container is not running."}
2025-09-27 00:45:33.636752 | orchestrator | fatal: [testbed-node-4]: FAILED! => {"changed": false, "msg": "kolla_toolbox container is not running."}
2025-09-27 00:45:33.636765 | orchestrator | fatal: [testbed-node-5]: FAILED! => {"changed": false, "msg": "kolla_toolbox container is not running."}
2025-09-27 00:45:33.636777 | orchestrator |
2025-09-27 00:45:33.636788 | orchestrator | PLAY RECAP *********************************************************************
2025-09-27 00:45:33.636801 | orchestrator | testbed-node-0 : ok=8  changed=5  unreachable=0 failed=1  skipped=0 rescued=0 ignored=0
2025-09-27 00:45:33.636816 | orchestrator | testbed-node-1 : ok=8  changed=5  unreachable=0 failed=1  skipped=0 rescued=0 ignored=0
2025-09-27 00:45:33.636835 | orchestrator | testbed-node-2 : ok=8  changed=5  unreachable=0 failed=1  skipped=0 rescued=0 ignored=0
2025-09-27 00:45:33.636848 | orchestrator | testbed-node-3 : ok=8  changed=5  unreachable=0 failed=1  skipped=0 rescued=0 ignored=0
2025-09-27 00:45:33.636867 | orchestrator | testbed-node-4 : ok=8  changed=5  unreachable=0 failed=1  skipped=0 rescued=0 ignored=0
2025-09-27 00:45:33.636878 | orchestrator | testbed-node-5 : ok=8  changed=5  unreachable=0 failed=1  skipped=0 rescued=0 ignored=0
2025-09-27 00:45:33.636888 | orchestrator |
2025-09-27 00:45:33.636899 | orchestrator |
2025-09-27 00:45:33.636910 | orchestrator | TASKS RECAP ********************************************************************
2025-09-27 00:45:33.636921 | orchestrator | Saturday 27 September 2025 00:45:32 +0000 (0:00:01.203) 0:00:10.522 ****
2025-09-27 00:45:33.636932 | orchestrator | ===============================================================================
2025-09-27 00:45:33.636943 | orchestrator | ovn-controller : Copying over systemd override -------------------------- 1.61s
2025-09-27 00:45:33.636954 | orchestrator | ovn-controller : Copying over config.json files for services ------------ 1.57s
2025-09-27 00:45:33.636964 | orchestrator | ovn-controller : Check ovn-controller containers ------------------------ 1.32s
2025-09-27 00:45:33.636975 | orchestrator | ovn-controller : Create br-int bridge on OpenvSwitch -------------------- 1.20s
2025-09-27 00:45:33.636986 | orchestrator | ovn-controller : Ensuring systemd override directory exists ------------- 1.20s
2025-09-27 00:45:33.636996 | orchestrator | ovn-controller : Ensuring config directories exist ---------------------- 1.00s
2025-09-27 00:45:33.637012 | orchestrator | ovn-controller : include_tasks ------------------------------------------ 0.96s
2025-09-27 00:45:33.637023 | orchestrator | Group hosts based on enabled services ----------------------------------- 0.86s
2025-09-27 00:45:33.637033 | orchestrator | Group hosts based on Kolla action --------------------------------------- 0.64s
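Note: the play above aborts at the br-int step because kolla-ansible drives that task through its kolla_toolbox helper container, and the common services (including kolla_toolbox and openvswitch) were not running on these nodes yet, which is exactly what the error message reports. As a rough illustration only, the sketch below reproduces the intent with plain builtin modules; the docker CLI calls and the openvswitch_vswitchd container name are assumptions, not the role's actual implementation, which appears to route an equivalent Open vSwitch call through the kolla_toolbox module.

    - name: Verify that the kolla_toolbox container is running
      ansible.builtin.command: docker ps --quiet --filter name=kolla_toolbox --filter status=running
      register: toolbox_check
      changed_when: false
      failed_when: toolbox_check.stdout | length == 0  # mirrors the "kolla_toolbox container is not running" failure above

    - name: Create br-int bridge on OpenvSwitch
      # assumed container name; ovs-vsctl --may-exist makes the call idempotent
      ansible.builtin.command: docker exec openvswitch_vswitchd ovs-vsctl --may-exist add-br br-int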
2025-09-27 00:45:33.637044 | orchestrator | 2025-09-27 00:45:33 | INFO  | Task 9bb7b69e-840d-4ad5-b3ec-d30a1abcbb15 is in state SUCCESS
2025-09-27 00:45:33.637055 | orchestrator | 2025-09-27 00:45:33 | INFO  | Task 625d002c-1b01-4e2f-8d5c-d5ab32c60c73 is in state STARTED
2025-09-27 00:45:33.638459 | orchestrator | 2025-09-27 00:45:33 | INFO  | Task 4bac4415-47ae-418f-938a-24c52f1a62d3 is in state STARTED
2025-09-27 00:45:33.640958 | orchestrator | 2025-09-27 00:45:33 | INFO  | Task 1d877c1d-aea7-49a7-ac1d-fd8e8254af58 is in state STARTED
2025-09-27 00:45:33.644067 | orchestrator | 2025-09-27 00:45:33 | INFO  | Task 0bfbbc49-122f-42e2-b6bc-552d5dd0b6e2 is in state STARTED
2025-09-27 00:45:33.644168 | orchestrator | 2025-09-27 00:45:33 | INFO  | Wait 1 second(s) until the next check
2025-09-27 00:45:36.689707 | orchestrator | 2025-09-27 00:45:36 | INFO  | Task 625d002c-1b01-4e2f-8d5c-d5ab32c60c73 is in state STARTED
2025-09-27 00:45:36.691800 | orchestrator | 2025-09-27 00:45:36 | INFO  | Task 4bac4415-47ae-418f-938a-24c52f1a62d3 is in state STARTED
2025-09-27 00:45:36.694675 | orchestrator | 2025-09-27 00:45:36 | INFO  | Task 1d877c1d-aea7-49a7-ac1d-fd8e8254af58 is in state STARTED
2025-09-27 00:45:36.697477 | orchestrator | 2025-09-27 00:45:36 | INFO  | Task 0bfbbc49-122f-42e2-b6bc-552d5dd0b6e2 is in state STARTED
2025-09-27 00:45:36.697505 | orchestrator | 2025-09-27 00:45:36 | INFO  | Wait 1 second(s) until the next check
2025-09-27 00:45:39.741769 | orchestrator | 2025-09-27 00:45:39 | INFO  | Task 625d002c-1b01-4e2f-8d5c-d5ab32c60c73 is in state STARTED
2025-09-27 00:45:39.743516 | orchestrator | 2025-09-27 00:45:39 | INFO  | Task 4bac4415-47ae-418f-938a-24c52f1a62d3 is in state STARTED
2025-09-27 00:45:39.746665 | orchestrator | 2025-09-27 00:45:39 | INFO  | Task 1d877c1d-aea7-49a7-ac1d-fd8e8254af58 is in state STARTED
2025-09-27 00:45:39.749551 | orchestrator | 2025-09-27 00:45:39 | INFO  | Task 0bfbbc49-122f-42e2-b6bc-552d5dd0b6e2 is in state STARTED
2025-09-27 00:45:39.749573 | orchestrator | 2025-09-27 00:45:39 | INFO  | Wait 1 second(s) until the next check
2025-09-27 00:45:42.784444 | orchestrator | 2025-09-27 00:45:42 | INFO  | Task 625d002c-1b01-4e2f-8d5c-d5ab32c60c73 is in state STARTED
2025-09-27 00:45:42.786063 | orchestrator | 2025-09-27 00:45:42 | INFO  | Task 4bac4415-47ae-418f-938a-24c52f1a62d3 is in state STARTED
2025-09-27 00:45:42.787765 | orchestrator | 2025-09-27 00:45:42 | INFO  | Task 1d877c1d-aea7-49a7-ac1d-fd8e8254af58 is in state STARTED
2025-09-27 00:45:42.789412 | orchestrator | 2025-09-27 00:45:42 | INFO  | Task 0bfbbc49-122f-42e2-b6bc-552d5dd0b6e2 is in state STARTED
2025-09-27 00:45:42.789612 | orchestrator | 2025-09-27 00:45:42 | INFO  | Wait 1 second(s) until the next check
2025-09-27 00:45:45.923462 | orchestrator | 2025-09-27 00:45:45 | INFO  | Task 625d002c-1b01-4e2f-8d5c-d5ab32c60c73 is in state STARTED
2025-09-27 00:45:45.923697 | orchestrator | 2025-09-27 00:45:45 | INFO  | Task 4bac4415-47ae-418f-938a-24c52f1a62d3 is in state STARTED
2025-09-27 00:45:45.927575 | orchestrator | 2025-09-27 00:45:45 | INFO  | Task 
1d877c1d-aea7-49a7-ac1d-fd8e8254af58 is in state STARTED 2025-09-27 00:45:45.928055 | orchestrator | 2025-09-27 00:45:45 | INFO  | Task 0bfbbc49-122f-42e2-b6bc-552d5dd0b6e2 is in state STARTED 2025-09-27 00:45:45.928091 | orchestrator | 2025-09-27 00:45:45 | INFO  | Wait 1 second(s) until the next check 2025-09-27 00:45:48.966379 | orchestrator | 2025-09-27 00:45:48 | INFO  | Task 625d002c-1b01-4e2f-8d5c-d5ab32c60c73 is in state STARTED 2025-09-27 00:45:48.966624 | orchestrator | 2025-09-27 00:45:48 | INFO  | Task 4bac4415-47ae-418f-938a-24c52f1a62d3 is in state STARTED 2025-09-27 00:45:48.967269 | orchestrator | 2025-09-27 00:45:48 | INFO  | Task 1d877c1d-aea7-49a7-ac1d-fd8e8254af58 is in state STARTED 2025-09-27 00:45:48.967889 | orchestrator | 2025-09-27 00:45:48 | INFO  | Task 0bfbbc49-122f-42e2-b6bc-552d5dd0b6e2 is in state STARTED 2025-09-27 00:45:48.967912 | orchestrator | 2025-09-27 00:45:48 | INFO  | Wait 1 second(s) until the next check 2025-09-27 00:45:51.992657 | orchestrator | 2025-09-27 00:45:51 | INFO  | Task 625d002c-1b01-4e2f-8d5c-d5ab32c60c73 is in state STARTED 2025-09-27 00:45:51.993010 | orchestrator | 2025-09-27 00:45:51 | INFO  | Task 4bac4415-47ae-418f-938a-24c52f1a62d3 is in state STARTED 2025-09-27 00:45:51.993877 | orchestrator | 2025-09-27 00:45:51 | INFO  | Task 1d877c1d-aea7-49a7-ac1d-fd8e8254af58 is in state STARTED 2025-09-27 00:45:51.994568 | orchestrator | 2025-09-27 00:45:51 | INFO  | Task 0bfbbc49-122f-42e2-b6bc-552d5dd0b6e2 is in state STARTED 2025-09-27 00:45:51.994612 | orchestrator | 2025-09-27 00:45:51 | INFO  | Wait 1 second(s) until the next check 2025-09-27 00:45:55.048714 | orchestrator | 2025-09-27 00:45:55 | INFO  | Task 625d002c-1b01-4e2f-8d5c-d5ab32c60c73 is in state STARTED 2025-09-27 00:45:55.049003 | orchestrator | 2025-09-27 00:45:55 | INFO  | Task 4bac4415-47ae-418f-938a-24c52f1a62d3 is in state STARTED 2025-09-27 00:45:55.049858 | orchestrator | 2025-09-27 00:45:55 | INFO  | Task 1d877c1d-aea7-49a7-ac1d-fd8e8254af58 is in state STARTED 2025-09-27 00:45:55.050968 | orchestrator | 2025-09-27 00:45:55 | INFO  | Task 0bfbbc49-122f-42e2-b6bc-552d5dd0b6e2 is in state STARTED 2025-09-27 00:45:55.050995 | orchestrator | 2025-09-27 00:45:55 | INFO  | Wait 1 second(s) until the next check 2025-09-27 00:45:58.093483 | orchestrator | 2025-09-27 00:45:58 | INFO  | Task 625d002c-1b01-4e2f-8d5c-d5ab32c60c73 is in state STARTED 2025-09-27 00:45:58.093618 | orchestrator | 2025-09-27 00:45:58 | INFO  | Task 4bac4415-47ae-418f-938a-24c52f1a62d3 is in state STARTED 2025-09-27 00:45:58.094296 | orchestrator | 2025-09-27 00:45:58 | INFO  | Task 1d877c1d-aea7-49a7-ac1d-fd8e8254af58 is in state STARTED 2025-09-27 00:45:58.095032 | orchestrator | 2025-09-27 00:45:58 | INFO  | Task 0bfbbc49-122f-42e2-b6bc-552d5dd0b6e2 is in state STARTED 2025-09-27 00:45:58.095778 | orchestrator | 2025-09-27 00:45:58 | INFO  | Wait 1 second(s) until the next check 2025-09-27 00:46:01.122690 | orchestrator | 2025-09-27 00:46:01 | INFO  | Task 625d002c-1b01-4e2f-8d5c-d5ab32c60c73 is in state STARTED 2025-09-27 00:46:01.125011 | orchestrator | 2025-09-27 00:46:01 | INFO  | Task 4bac4415-47ae-418f-938a-24c52f1a62d3 is in state STARTED 2025-09-27 00:46:01.128347 | orchestrator | 2025-09-27 00:46:01 | INFO  | Task 1d877c1d-aea7-49a7-ac1d-fd8e8254af58 is in state STARTED 2025-09-27 00:46:01.130596 | orchestrator | 2025-09-27 00:46:01 | INFO  | Task 0bfbbc49-122f-42e2-b6bc-552d5dd0b6e2 is in state STARTED 2025-09-27 00:46:01.130830 | orchestrator | 2025-09-27 00:46:01 | INFO  | Wait 1 
second(s) until the next check 2025-09-27 00:46:04.172965 | orchestrator | 2025-09-27 00:46:04 | INFO  | Task 625d002c-1b01-4e2f-8d5c-d5ab32c60c73 is in state STARTED 2025-09-27 00:46:04.174482 | orchestrator | 2025-09-27 00:46:04 | INFO  | Task 4bac4415-47ae-418f-938a-24c52f1a62d3 is in state STARTED 2025-09-27 00:46:04.175857 | orchestrator | 2025-09-27 00:46:04 | INFO  | Task 1d877c1d-aea7-49a7-ac1d-fd8e8254af58 is in state STARTED 2025-09-27 00:46:04.177025 | orchestrator | 2025-09-27 00:46:04 | INFO  | Task 0bfbbc49-122f-42e2-b6bc-552d5dd0b6e2 is in state STARTED 2025-09-27 00:46:04.177050 | orchestrator | 2025-09-27 00:46:04 | INFO  | Wait 1 second(s) until the next check 2025-09-27 00:46:07.214255 | orchestrator | 2025-09-27 00:46:07 | INFO  | Task 625d002c-1b01-4e2f-8d5c-d5ab32c60c73 is in state STARTED 2025-09-27 00:46:07.215491 | orchestrator | 2025-09-27 00:46:07 | INFO  | Task 4bac4415-47ae-418f-938a-24c52f1a62d3 is in state STARTED 2025-09-27 00:46:07.216294 | orchestrator | 2025-09-27 00:46:07 | INFO  | Task 1d877c1d-aea7-49a7-ac1d-fd8e8254af58 is in state STARTED 2025-09-27 00:46:07.217840 | orchestrator | 2025-09-27 00:46:07 | INFO  | Task 0bfbbc49-122f-42e2-b6bc-552d5dd0b6e2 is in state STARTED 2025-09-27 00:46:07.217869 | orchestrator | 2025-09-27 00:46:07 | INFO  | Wait 1 second(s) until the next check 2025-09-27 00:46:10.252693 | orchestrator | 2025-09-27 00:46:10 | INFO  | Task 625d002c-1b01-4e2f-8d5c-d5ab32c60c73 is in state STARTED 2025-09-27 00:46:10.253193 | orchestrator | 2025-09-27 00:46:10 | INFO  | Task 4bac4415-47ae-418f-938a-24c52f1a62d3 is in state STARTED 2025-09-27 00:46:10.254313 | orchestrator | 2025-09-27 00:46:10 | INFO  | Task 1d877c1d-aea7-49a7-ac1d-fd8e8254af58 is in state STARTED 2025-09-27 00:46:10.255456 | orchestrator | 2025-09-27 00:46:10 | INFO  | Task 0bfbbc49-122f-42e2-b6bc-552d5dd0b6e2 is in state STARTED 2025-09-27 00:46:10.255488 | orchestrator | 2025-09-27 00:46:10 | INFO  | Wait 1 second(s) until the next check 2025-09-27 00:46:13.290847 | orchestrator | 2025-09-27 00:46:13 | INFO  | Task 625d002c-1b01-4e2f-8d5c-d5ab32c60c73 is in state STARTED 2025-09-27 00:46:13.290959 | orchestrator | 2025-09-27 00:46:13 | INFO  | Task 4bac4415-47ae-418f-938a-24c52f1a62d3 is in state STARTED 2025-09-27 00:46:13.290974 | orchestrator | 2025-09-27 00:46:13 | INFO  | Task 1d877c1d-aea7-49a7-ac1d-fd8e8254af58 is in state STARTED 2025-09-27 00:46:13.290986 | orchestrator | 2025-09-27 00:46:13 | INFO  | Task 0bfbbc49-122f-42e2-b6bc-552d5dd0b6e2 is in state STARTED 2025-09-27 00:46:13.290997 | orchestrator | 2025-09-27 00:46:13 | INFO  | Wait 1 second(s) until the next check 2025-09-27 00:46:16.321022 | orchestrator | 2025-09-27 00:46:16 | INFO  | Task 625d002c-1b01-4e2f-8d5c-d5ab32c60c73 is in state STARTED 2025-09-27 00:46:16.321636 | orchestrator | 2025-09-27 00:46:16 | INFO  | Task 4bac4415-47ae-418f-938a-24c52f1a62d3 is in state STARTED 2025-09-27 00:46:16.322834 | orchestrator | 2025-09-27 00:46:16 | INFO  | Task 1d877c1d-aea7-49a7-ac1d-fd8e8254af58 is in state STARTED 2025-09-27 00:46:16.323807 | orchestrator | 2025-09-27 00:46:16 | INFO  | Task 0bfbbc49-122f-42e2-b6bc-552d5dd0b6e2 is in state STARTED 2025-09-27 00:46:16.323930 | orchestrator | 2025-09-27 00:46:16 | INFO  | Wait 1 second(s) until the next check 2025-09-27 00:46:19.365652 | orchestrator | 2025-09-27 00:46:19 | INFO  | Task 625d002c-1b01-4e2f-8d5c-d5ab32c60c73 is in state STARTED 2025-09-27 00:46:19.366903 | orchestrator | 2025-09-27 00:46:19 | INFO  | Task 
4bac4415-47ae-418f-938a-24c52f1a62d3 is in state STARTED 2025-09-27 00:46:19.370260 | orchestrator | 2025-09-27 00:46:19 | INFO  | Task 1d877c1d-aea7-49a7-ac1d-fd8e8254af58 is in state STARTED 2025-09-27 00:46:19.371983 | orchestrator | 2025-09-27 00:46:19 | INFO  | Task 0bfbbc49-122f-42e2-b6bc-552d5dd0b6e2 is in state STARTED 2025-09-27 00:46:19.372010 | orchestrator | 2025-09-27 00:46:19 | INFO  | Wait 1 second(s) until the next check 2025-09-27 00:46:22.454675 | orchestrator | 2025-09-27 00:46:22 | INFO  | Task 625d002c-1b01-4e2f-8d5c-d5ab32c60c73 is in state STARTED 2025-09-27 00:46:22.454887 | orchestrator | 2025-09-27 00:46:22 | INFO  | Task 4bac4415-47ae-418f-938a-24c52f1a62d3 is in state STARTED 2025-09-27 00:46:22.454919 | orchestrator | 2025-09-27 00:46:22 | INFO  | Task 1d877c1d-aea7-49a7-ac1d-fd8e8254af58 is in state STARTED 2025-09-27 00:46:22.455719 | orchestrator | 2025-09-27 00:46:22 | INFO  | Task 0bfbbc49-122f-42e2-b6bc-552d5dd0b6e2 is in state STARTED 2025-09-27 00:46:22.455748 | orchestrator | 2025-09-27 00:46:22 | INFO  | Wait 1 second(s) until the next check 2025-09-27 00:46:25.524284 | orchestrator | 2025-09-27 00:46:25 | INFO  | Task 625d002c-1b01-4e2f-8d5c-d5ab32c60c73 is in state STARTED 2025-09-27 00:46:25.524383 | orchestrator | 2025-09-27 00:46:25 | INFO  | Task 4bac4415-47ae-418f-938a-24c52f1a62d3 is in state STARTED 2025-09-27 00:46:25.524621 | orchestrator | 2025-09-27 00:46:25 | INFO  | Task 1d877c1d-aea7-49a7-ac1d-fd8e8254af58 is in state STARTED 2025-09-27 00:46:25.525097 | orchestrator | 2025-09-27 00:46:25 | INFO  | Task 0bfbbc49-122f-42e2-b6bc-552d5dd0b6e2 is in state STARTED 2025-09-27 00:46:25.525385 | orchestrator | 2025-09-27 00:46:25 | INFO  | Wait 1 second(s) until the next check 2025-09-27 00:46:28.563148 | orchestrator | 2025-09-27 00:46:28 | INFO  | Task 625d002c-1b01-4e2f-8d5c-d5ab32c60c73 is in state STARTED 2025-09-27 00:46:28.564148 | orchestrator | 2025-09-27 00:46:28 | INFO  | Task 4bac4415-47ae-418f-938a-24c52f1a62d3 is in state STARTED 2025-09-27 00:46:28.564887 | orchestrator | 2025-09-27 00:46:28 | INFO  | Task 1d877c1d-aea7-49a7-ac1d-fd8e8254af58 is in state STARTED 2025-09-27 00:46:28.565668 | orchestrator | 2025-09-27 00:46:28 | INFO  | Task 0bfbbc49-122f-42e2-b6bc-552d5dd0b6e2 is in state STARTED 2025-09-27 00:46:28.565893 | orchestrator | 2025-09-27 00:46:28 | INFO  | Wait 1 second(s) until the next check 2025-09-27 00:46:31.605771 | orchestrator | 2025-09-27 00:46:31 | INFO  | Task 625d002c-1b01-4e2f-8d5c-d5ab32c60c73 is in state STARTED 2025-09-27 00:46:31.606147 | orchestrator | 2025-09-27 00:46:31 | INFO  | Task 4bac4415-47ae-418f-938a-24c52f1a62d3 is in state STARTED 2025-09-27 00:46:31.606888 | orchestrator | 2025-09-27 00:46:31 | INFO  | Task 1d877c1d-aea7-49a7-ac1d-fd8e8254af58 is in state STARTED 2025-09-27 00:46:31.609671 | orchestrator | 2025-09-27 00:46:31 | INFO  | Task 0bfbbc49-122f-42e2-b6bc-552d5dd0b6e2 is in state STARTED 2025-09-27 00:46:31.609695 | orchestrator | 2025-09-27 00:46:31 | INFO  | Wait 1 second(s) until the next check 2025-09-27 00:46:34.643767 | orchestrator | 2025-09-27 00:46:34 | INFO  | Task 625d002c-1b01-4e2f-8d5c-d5ab32c60c73 is in state STARTED 2025-09-27 00:46:34.644041 | orchestrator | 2025-09-27 00:46:34 | INFO  | Task 4bac4415-47ae-418f-938a-24c52f1a62d3 is in state STARTED 2025-09-27 00:46:34.645006 | orchestrator | 2025-09-27 00:46:34 | INFO  | Task 1d877c1d-aea7-49a7-ac1d-fd8e8254af58 is in state STARTED 2025-09-27 00:46:34.646182 | orchestrator | 2025-09-27 00:46:34 | INFO  | Task 
0bfbbc49-122f-42e2-b6bc-552d5dd0b6e2 is in state STARTED 2025-09-27 00:46:34.646275 | orchestrator | 2025-09-27 00:46:34 | INFO  | Wait 1 second(s) until the next check 2025-09-27 00:46:37.683833 | orchestrator | 2025-09-27 00:46:37 | INFO  | Task 625d002c-1b01-4e2f-8d5c-d5ab32c60c73 is in state STARTED 2025-09-27 00:46:37.685323 | orchestrator | 2025-09-27 00:46:37 | INFO  | Task 4bac4415-47ae-418f-938a-24c52f1a62d3 is in state STARTED 2025-09-27 00:46:37.687631 | orchestrator | 2025-09-27 00:46:37 | INFO  | Task 1d877c1d-aea7-49a7-ac1d-fd8e8254af58 is in state STARTED 2025-09-27 00:46:37.688966 | orchestrator | 2025-09-27 00:46:37 | INFO  | Task 0bfbbc49-122f-42e2-b6bc-552d5dd0b6e2 is in state STARTED 2025-09-27 00:46:37.689930 | orchestrator | 2025-09-27 00:46:37 | INFO  | Wait 1 second(s) until the next check 2025-09-27 00:46:40.726640 | orchestrator | 2025-09-27 00:46:40 | INFO  | Task 625d002c-1b01-4e2f-8d5c-d5ab32c60c73 is in state STARTED 2025-09-27 00:46:40.731461 | orchestrator | 2025-09-27 00:46:40 | INFO  | Task 4bac4415-47ae-418f-938a-24c52f1a62d3 is in state STARTED 2025-09-27 00:46:40.732470 | orchestrator | 2025-09-27 00:46:40 | INFO  | Task 1d877c1d-aea7-49a7-ac1d-fd8e8254af58 is in state STARTED 2025-09-27 00:46:40.734988 | orchestrator | 2025-09-27 00:46:40 | INFO  | Task 0bfbbc49-122f-42e2-b6bc-552d5dd0b6e2 is in state STARTED 2025-09-27 00:46:40.735012 | orchestrator | 2025-09-27 00:46:40 | INFO  | Wait 1 second(s) until the next check 2025-09-27 00:46:43.787826 | orchestrator | 2025-09-27 00:46:43 | INFO  | Task 625d002c-1b01-4e2f-8d5c-d5ab32c60c73 is in state STARTED 2025-09-27 00:46:43.789496 | orchestrator | 2025-09-27 00:46:43 | INFO  | Task 4bac4415-47ae-418f-938a-24c52f1a62d3 is in state STARTED 2025-09-27 00:46:43.791722 | orchestrator | 2025-09-27 00:46:43 | INFO  | Task 1d877c1d-aea7-49a7-ac1d-fd8e8254af58 is in state STARTED 2025-09-27 00:46:43.792822 | orchestrator | 2025-09-27 00:46:43 | INFO  | Task 0bfbbc49-122f-42e2-b6bc-552d5dd0b6e2 is in state STARTED 2025-09-27 00:46:43.792934 | orchestrator | 2025-09-27 00:46:43 | INFO  | Wait 1 second(s) until the next check 2025-09-27 00:46:46.831263 | orchestrator | 2025-09-27 00:46:46 | INFO  | Task 625d002c-1b01-4e2f-8d5c-d5ab32c60c73 is in state STARTED 2025-09-27 00:46:46.831489 | orchestrator | 2025-09-27 00:46:46 | INFO  | Task 4bac4415-47ae-418f-938a-24c52f1a62d3 is in state STARTED 2025-09-27 00:46:46.831722 | orchestrator | 2025-09-27 00:46:46 | INFO  | Task 1d877c1d-aea7-49a7-ac1d-fd8e8254af58 is in state STARTED 2025-09-27 00:46:46.832332 | orchestrator | 2025-09-27 00:46:46 | INFO  | Task 0bfbbc49-122f-42e2-b6bc-552d5dd0b6e2 is in state STARTED 2025-09-27 00:46:46.832365 | orchestrator | 2025-09-27 00:46:46 | INFO  | Wait 1 second(s) until the next check 2025-09-27 00:46:49.856813 | orchestrator | 2025-09-27 00:46:49 | INFO  | Task 625d002c-1b01-4e2f-8d5c-d5ab32c60c73 is in state STARTED 2025-09-27 00:46:49.857464 | orchestrator | 2025-09-27 00:46:49 | INFO  | Task 4bac4415-47ae-418f-938a-24c52f1a62d3 is in state STARTED 2025-09-27 00:46:49.861201 | orchestrator | 2025-09-27 00:46:49 | INFO  | Task 1d877c1d-aea7-49a7-ac1d-fd8e8254af58 is in state STARTED 2025-09-27 00:46:49.865682 | orchestrator | 2025-09-27 00:46:49 | INFO  | Task 0bfbbc49-122f-42e2-b6bc-552d5dd0b6e2 is in state STARTED 2025-09-27 00:46:49.865818 | orchestrator | 2025-09-27 00:46:49 | INFO  | Wait 1 second(s) until the next check 2025-09-27 00:46:52.919732 | orchestrator | 2025-09-27 00:46:52 | INFO  | Task 
625d002c-1b01-4e2f-8d5c-d5ab32c60c73 is in state STARTED
2025-09-27 00:46:52.919838 | orchestrator | 2025-09-27 00:46:52 | INFO  | Task 4bac4415-47ae-418f-938a-24c52f1a62d3 is in state STARTED
2025-09-27 00:46:52.919853 | orchestrator | 2025-09-27 00:46:52 | INFO  | Task 1d877c1d-aea7-49a7-ac1d-fd8e8254af58 is in state STARTED
2025-09-27 00:46:52.921021 | orchestrator | 2025-09-27 00:46:52 | INFO  | Task 0bfbbc49-122f-42e2-b6bc-552d5dd0b6e2 is in state STARTED
2025-09-27 00:46:52.921056 | orchestrator | 2025-09-27 00:46:52 | INFO  | Wait 1 second(s) until the next check
2025-09-27 00:46:55.979395 | orchestrator | 2025-09-27 00:46:55 | INFO  | Task 625d002c-1b01-4e2f-8d5c-d5ab32c60c73 is in state STARTED
2025-09-27 00:46:55.980867 | orchestrator | 2025-09-27 00:46:55 | INFO  | Task 4bac4415-47ae-418f-938a-24c52f1a62d3 is in state STARTED
2025-09-27 00:46:55.983845 | orchestrator | 2025-09-27 00:46:55 | INFO  | Task 1d877c1d-aea7-49a7-ac1d-fd8e8254af58 is in state STARTED
2025-09-27 00:46:55.984961 | orchestrator | 2025-09-27 00:46:55 | INFO  | Task 0bfbbc49-122f-42e2-b6bc-552d5dd0b6e2 is in state STARTED
2025-09-27 00:46:55.984996 | orchestrator | 2025-09-27 00:46:55 | INFO  | Wait 1 second(s) until the next check
2025-09-27 00:46:59.045908 | orchestrator | 2025-09-27 00:46:59 | INFO  | Task 625d002c-1b01-4e2f-8d5c-d5ab32c60c73 is in state STARTED
2025-09-27 00:46:59.046000 | orchestrator | 2025-09-27 00:46:59 | INFO  | Task 4bac4415-47ae-418f-938a-24c52f1a62d3 is in state STARTED
2025-09-27 00:46:59.049047 | orchestrator | 2025-09-27 00:46:59 | INFO  | Task 1d877c1d-aea7-49a7-ac1d-fd8e8254af58 is in state STARTED
2025-09-27 00:46:59.049636 | orchestrator | 2025-09-27 00:46:59 | INFO  | Task 0bfbbc49-122f-42e2-b6bc-552d5dd0b6e2 is in state STARTED
2025-09-27 00:46:59.049735 | orchestrator | 2025-09-27 00:46:59 | INFO  | Wait 1 second(s) until the next check
2025-09-27 00:47:02.157776 | orchestrator | 2025-09-27 00:47:02 | INFO  | Task 625d002c-1b01-4e2f-8d5c-d5ab32c60c73 is in state STARTED
2025-09-27 00:47:02.158282 | orchestrator | 2025-09-27 00:47:02 | INFO  | Task 4bac4415-47ae-418f-938a-24c52f1a62d3 is in state STARTED
2025-09-27 00:47:02.158725 | orchestrator | 2025-09-27 00:47:02 | INFO  | Task 1d877c1d-aea7-49a7-ac1d-fd8e8254af58 is in state STARTED
2025-09-27 00:47:02.159542 | orchestrator |
2025-09-27 00:47:02.159568 | orchestrator | 2025-09-27 00:47:02 | INFO  | Task 0bfbbc49-122f-42e2-b6bc-552d5dd0b6e2 is in state SUCCESS
2025-09-27 00:47:02.160611 | orchestrator |
2025-09-27 00:47:02.160641 | orchestrator | PLAY [Set kolla_action_rabbitmq] ***********************************************
2025-09-27 00:47:02.160654 | orchestrator |
2025-09-27 00:47:02.160666 | orchestrator | TASK [Inform the user about the following task] ********************************
2025-09-27 00:47:02.160678 | orchestrator | Saturday 27 September 2025 00:44:49 +0000 (0:00:00.172) 0:00:00.172 ****
2025-09-27 00:47:02.160690 | orchestrator | ok: [localhost] => {
2025-09-27 00:47:02.160703 | orchestrator |  "msg": "The task 'Check RabbitMQ service' fails if the RabbitMQ service has not yet been deployed. This is fine."
2025-09-27 00:47:02.160715 | orchestrator | }
2025-09-27 00:47:02.160726 | orchestrator |
2025-09-27 00:47:02.160737 | orchestrator | TASK [Check RabbitMQ service] **************************************************
2025-09-27 00:47:02.160748 | orchestrator | Saturday 27 September 2025 00:44:49 +0000 (0:00:00.034) 0:00:00.207 ****
2025-09-27 00:47:02.160760 | orchestrator | fatal: [localhost]: FAILED! => {"changed": false, "elapsed": 2, "msg": "Timeout when waiting for search string RabbitMQ Management in 192.168.16.9:15672"}
2025-09-27 00:47:02.160797 | orchestrator | ...ignoring
2025-09-27 00:47:02.160810 | orchestrator |
2025-09-27 00:47:02.160821 | orchestrator | TASK [Set kolla_action_rabbitmq = upgrade if RabbitMQ is already running] ******
2025-09-27 00:47:02.160832 | orchestrator | Saturday 27 September 2025 00:44:52 +0000 (0:00:02.834) 0:00:03.041 ****
2025-09-27 00:47:02.160843 | orchestrator | skipping: [localhost]
2025-09-27 00:47:02.160854 | orchestrator |
2025-09-27 00:47:02.160864 | orchestrator | TASK [Set kolla_action_rabbitmq = kolla_action_ng] *****************************
2025-09-27 00:47:02.160875 | orchestrator | Saturday 27 September 2025 00:44:52 +0000 (0:00:00.040) 0:00:03.082 ****
2025-09-27 00:47:02.160886 | orchestrator | ok: [localhost]
2025-09-27 00:47:02.160897 | orchestrator |
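Note: the ignored failure above is only a reachability probe. The play checks whether a RabbitMQ management endpoint already answers and switches kolla_action_rabbitmq to upgrade only in that case; on a freshly deployed testbed the timeout is the expected outcome, as the preceding message explains. A minimal sketch of that pattern, with the host and port taken from the log output and the variable names treated as assumptions:

    - name: Check RabbitMQ service
      ansible.builtin.wait_for:
        host: 192.168.16.9
        port: 15672
        search_regex: RabbitMQ Management
        timeout: 2
      register: rabbitmq_check
      ignore_errors: true  # the log shows the failure being ignored on purpose

    - name: Set kolla_action_rabbitmq = upgrade if RabbitMQ is already running
      ansible.builtin.set_fact:
        kolla_action_rabbitmq: upgrade
      when: rabbitmq_check is not failed

    - name: Set kolla_action_rabbitmq = kolla_action_ng
      ansible.builtin.set_fact:
        kolla_action_rabbitmq: "{{ kolla_action_ng }}"
      when: rabbitmq_check is failed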
2025-09-27 00:47:02.160907 | orchestrator | PLAY [Group hosts based on configuration] **************************************
2025-09-27 00:47:02.160918 | orchestrator |
2025-09-27 00:47:02.160929 | orchestrator | TASK [Group hosts based on Kolla action] ***************************************
2025-09-27 00:47:02.160939 | orchestrator | Saturday 27 September 2025 00:44:52 +0000 (0:00:00.173) 0:00:03.255 ****
2025-09-27 00:47:02.160950 | orchestrator | ok: [testbed-node-0]
2025-09-27 00:47:02.160961 | orchestrator | ok: [testbed-node-1]
2025-09-27 00:47:02.160971 | orchestrator | ok: [testbed-node-2]
2025-09-27 00:47:02.160982 | orchestrator |
2025-09-27 00:47:02.160993 | orchestrator | TASK [Group hosts based on enabled services] ***********************************
2025-09-27 00:47:02.161003 | orchestrator | Saturday 27 September 2025 00:44:52 +0000 (0:00:00.358) 0:00:03.614 ****
2025-09-27 00:47:02.161014 | orchestrator | ok: [testbed-node-0] => (item=enable_rabbitmq_True)
2025-09-27 00:47:02.161025 | orchestrator | ok: [testbed-node-1] => (item=enable_rabbitmq_True)
2025-09-27 00:47:02.161036 | orchestrator | ok: [testbed-node-2] => (item=enable_rabbitmq_True)
2025-09-27 00:47:02.161046 | orchestrator |
2025-09-27 00:47:02.161057 | orchestrator | PLAY [Apply role rabbitmq] *****************************************************
2025-09-27 00:47:02.161068 | orchestrator |
2025-09-27 00:47:02.161078 | orchestrator | TASK [rabbitmq : include_tasks] ************************************************
2025-09-27 00:47:02.161089 | orchestrator | Saturday 27 September 2025 00:44:53 +0000 (0:00:00.414) 0:00:04.029 ****
2025-09-27 00:47:02.161100 | orchestrator | included: /ansible/roles/rabbitmq/tasks/deploy.yml for testbed-node-0, testbed-node-1, testbed-node-2
2025-09-27 00:47:02.161111 | orchestrator |
2025-09-27 00:47:02.161122 | orchestrator | TASK [rabbitmq : Get container facts] ******************************************
2025-09-27 00:47:02.161132 | orchestrator | Saturday 27 September 2025 00:44:54 +0000 (0:00:00.643) 0:00:04.672 ****
2025-09-27 00:47:02.161143 | orchestrator | ok: [testbed-node-0]
2025-09-27 00:47:02.161154 | orchestrator |
2025-09-27 00:47:02.161164 | orchestrator | TASK 
[rabbitmq : Get current RabbitMQ version] ********************************* 2025-09-27 00:47:02.161176 | orchestrator | Saturday 27 September 2025 00:44:54 +0000 (0:00:00.872) 0:00:05.545 **** 2025-09-27 00:47:02.161186 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:47:02.161198 | orchestrator | 2025-09-27 00:47:02.161234 | orchestrator | TASK [rabbitmq : Get new RabbitMQ version] ************************************* 2025-09-27 00:47:02.161248 | orchestrator | Saturday 27 September 2025 00:44:55 +0000 (0:00:00.364) 0:00:05.909 **** 2025-09-27 00:47:02.161260 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:47:02.161273 | orchestrator | 2025-09-27 00:47:02.161285 | orchestrator | TASK [rabbitmq : Check if running RabbitMQ is at most one version behind] ****** 2025-09-27 00:47:02.161298 | orchestrator | Saturday 27 September 2025 00:44:55 +0000 (0:00:00.369) 0:00:06.278 **** 2025-09-27 00:47:02.161310 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:47:02.161323 | orchestrator | 2025-09-27 00:47:02.161336 | orchestrator | TASK [rabbitmq : Catch when RabbitMQ is being downgraded] ********************** 2025-09-27 00:47:02.161348 | orchestrator | Saturday 27 September 2025 00:44:55 +0000 (0:00:00.350) 0:00:06.629 **** 2025-09-27 00:47:02.161360 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:47:02.161373 | orchestrator | 2025-09-27 00:47:02.161385 | orchestrator | TASK [rabbitmq : include_tasks] ************************************************ 2025-09-27 00:47:02.161397 | orchestrator | Saturday 27 September 2025 00:44:56 +0000 (0:00:00.336) 0:00:06.966 **** 2025-09-27 00:47:02.161418 | orchestrator | included: /ansible/roles/rabbitmq/tasks/remove-ha-all-policy.yml for testbed-node-0, testbed-node-1, testbed-node-2 2025-09-27 00:47:02.161431 | orchestrator | 2025-09-27 00:47:02.161443 | orchestrator | TASK [rabbitmq : Get container facts] ****************************************** 2025-09-27 00:47:02.161456 | orchestrator | Saturday 27 September 2025 00:44:57 +0000 (0:00:00.748) 0:00:07.714 **** 2025-09-27 00:47:02.161468 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:47:02.161480 | orchestrator | 2025-09-27 00:47:02.161493 | orchestrator | TASK [rabbitmq : List RabbitMQ policies] *************************************** 2025-09-27 00:47:02.161506 | orchestrator | Saturday 27 September 2025 00:44:58 +0000 (0:00:00.923) 0:00:08.637 **** 2025-09-27 00:47:02.161518 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:47:02.161531 | orchestrator | 2025-09-27 00:47:02.161552 | orchestrator | TASK [rabbitmq : Remove ha-all policy from RabbitMQ] *************************** 2025-09-27 00:47:02.161566 | orchestrator | Saturday 27 September 2025 00:44:58 +0000 (0:00:00.348) 0:00:08.986 **** 2025-09-27 00:47:02.161579 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:47:02.161590 | orchestrator | 2025-09-27 00:47:02.161619 | orchestrator | TASK [rabbitmq : Ensuring config directories exist] **************************** 2025-09-27 00:47:02.161630 | orchestrator | Saturday 27 September 2025 00:44:58 +0000 (0:00:00.321) 0:00:09.307 **** 2025-09-27 00:47:02.161647 | orchestrator | changed: [testbed-node-1] => (item={'key': 'rabbitmq', 'value': {'container_name': 'rabbitmq', 'group': 'rabbitmq', 'enabled': True, 'image': 'registry.osism.tech/kolla/rabbitmq:2024.2', 'bootstrap_environment': {'KOLLA_BOOTSTRAP': None, 'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': 'zdd6geSBXefcI7IoHnP1U1fxtRWS3u5QtnPCvQTT', 'RABBITMQ_LOG_DIR': 
'/var/log/kolla/rabbitmq'}, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': 'zdd6geSBXefcI7IoHnP1U1fxtRWS3u5QtnPCvQTT', 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'volumes': ['/etc/kolla/rabbitmq/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'rabbitmq:/var/lib/rabbitmq/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_rabbitmq'], 'timeout': '30'}, 'haproxy': {'rabbitmq_management': {'enabled': 'yes', 'mode': 'http', 'port': '15672', 'host_group': 'rabbitmq'}}}}) 2025-09-27 00:47:02.161663 | orchestrator | changed: [testbed-node-0] => (item={'key': 'rabbitmq', 'value': {'container_name': 'rabbitmq', 'group': 'rabbitmq', 'enabled': True, 'image': 'registry.osism.tech/kolla/rabbitmq:2024.2', 'bootstrap_environment': {'KOLLA_BOOTSTRAP': None, 'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': 'zdd6geSBXefcI7IoHnP1U1fxtRWS3u5QtnPCvQTT', 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': 'zdd6geSBXefcI7IoHnP1U1fxtRWS3u5QtnPCvQTT', 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'volumes': ['/etc/kolla/rabbitmq/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'rabbitmq:/var/lib/rabbitmq/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_rabbitmq'], 'timeout': '30'}, 'haproxy': {'rabbitmq_management': {'enabled': 'yes', 'mode': 'http', 'port': '15672', 'host_group': 'rabbitmq'}}}}) 2025-09-27 00:47:02.161678 | orchestrator | changed: [testbed-node-2] => (item={'key': 'rabbitmq', 'value': {'container_name': 'rabbitmq', 'group': 'rabbitmq', 'enabled': True, 'image': 'registry.osism.tech/kolla/rabbitmq:2024.2', 'bootstrap_environment': {'KOLLA_BOOTSTRAP': None, 'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': 'zdd6geSBXefcI7IoHnP1U1fxtRWS3u5QtnPCvQTT', 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': 'zdd6geSBXefcI7IoHnP1U1fxtRWS3u5QtnPCvQTT', 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'volumes': ['/etc/kolla/rabbitmq/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'rabbitmq:/var/lib/rabbitmq/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_rabbitmq'], 'timeout': '30'}, 'haproxy': {'rabbitmq_management': {'enabled': 'yes', 'mode': 'http', 'port': '15672', 'host_group': 'rabbitmq'}}}}) 2025-09-27 00:47:02.161697 | orchestrator | 2025-09-27 00:47:02.161708 | orchestrator | TASK [rabbitmq : Copying over config.json files for services] ****************** 2025-09-27 00:47:02.161719 | orchestrator | Saturday 27 September 2025 00:44:59 +0000 (0:00:00.801) 0:00:10.108 **** 2025-09-27 00:47:02.161745 | orchestrator | changed: [testbed-node-2] => (item={'key': 'rabbitmq', 'value': {'container_name': 'rabbitmq', 'group': 'rabbitmq', 'enabled': True, 'image': 'registry.osism.tech/kolla/rabbitmq:2024.2', 'bootstrap_environment': {'KOLLA_BOOTSTRAP': None, 'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': 'zdd6geSBXefcI7IoHnP1U1fxtRWS3u5QtnPCvQTT', 'RABBITMQ_LOG_DIR': 
'/var/log/kolla/rabbitmq'}, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': 'zdd6geSBXefcI7IoHnP1U1fxtRWS3u5QtnPCvQTT', 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'volumes': ['/etc/kolla/rabbitmq/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'rabbitmq:/var/lib/rabbitmq/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_rabbitmq'], 'timeout': '30'}, 'haproxy': {'rabbitmq_management': {'enabled': 'yes', 'mode': 'http', 'port': '15672', 'host_group': 'rabbitmq'}}}}) 2025-09-27 00:47:02.161758 | orchestrator | changed: [testbed-node-1] => (item={'key': 'rabbitmq', 'value': {'container_name': 'rabbitmq', 'group': 'rabbitmq', 'enabled': True, 'image': 'registry.osism.tech/kolla/rabbitmq:2024.2', 'bootstrap_environment': {'KOLLA_BOOTSTRAP': None, 'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': 'zdd6geSBXefcI7IoHnP1U1fxtRWS3u5QtnPCvQTT', 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': 'zdd6geSBXefcI7IoHnP1U1fxtRWS3u5QtnPCvQTT', 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'volumes': ['/etc/kolla/rabbitmq/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'rabbitmq:/var/lib/rabbitmq/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_rabbitmq'], 'timeout': '30'}, 'haproxy': {'rabbitmq_management': {'enabled': 'yes', 'mode': 'http', 'port': '15672', 'host_group': 'rabbitmq'}}}}) 2025-09-27 00:47:02.161771 | orchestrator | changed: [testbed-node-0] => (item={'key': 'rabbitmq', 'value': {'container_name': 'rabbitmq', 'group': 'rabbitmq', 'enabled': True, 'image': 'registry.osism.tech/kolla/rabbitmq:2024.2', 'bootstrap_environment': {'KOLLA_BOOTSTRAP': None, 'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': 'zdd6geSBXefcI7IoHnP1U1fxtRWS3u5QtnPCvQTT', 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': 'zdd6geSBXefcI7IoHnP1U1fxtRWS3u5QtnPCvQTT', 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'volumes': ['/etc/kolla/rabbitmq/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'rabbitmq:/var/lib/rabbitmq/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_rabbitmq'], 'timeout': '30'}, 'haproxy': {'rabbitmq_management': {'enabled': 'yes', 'mode': 'http', 'port': '15672', 'host_group': 'rabbitmq'}}}}) 2025-09-27 00:47:02.161782 | orchestrator | 2025-09-27 00:47:02.161800 | orchestrator | TASK [rabbitmq : Copying over rabbitmq-env.conf] ******************************* 2025-09-27 00:47:02.161811 | orchestrator | Saturday 27 September 2025 00:45:00 +0000 (0:00:01.494) 0:00:11.603 **** 2025-09-27 00:47:02.161823 | orchestrator | changed: [testbed-node-0] => (item=/ansible/roles/rabbitmq/templates/rabbitmq-env.conf.j2) 2025-09-27 00:47:02.161833 | orchestrator | changed: [testbed-node-1] => (item=/ansible/roles/rabbitmq/templates/rabbitmq-env.conf.j2) 2025-09-27 00:47:02.161844 | orchestrator | changed: [testbed-node-2] => (item=/ansible/roles/rabbitmq/templates/rabbitmq-env.conf.j2) 2025-09-27 
00:47:02.161855 | orchestrator |
2025-09-27 00:47:02.161866 | orchestrator | TASK [rabbitmq : Copying over rabbitmq.conf] ***********************************
2025-09-27 00:47:02.161877 | orchestrator | Saturday 27 September 2025 00:45:02 +0000 (0:00:01.587) 0:00:13.191 ****
2025-09-27 00:47:02.161888 | orchestrator | changed: [testbed-node-0] => (item=/ansible/roles/rabbitmq/templates/rabbitmq.conf.j2)
2025-09-27 00:47:02.161898 | orchestrator | changed: [testbed-node-1] => (item=/ansible/roles/rabbitmq/templates/rabbitmq.conf.j2)
2025-09-27 00:47:02.161909 | orchestrator | changed: [testbed-node-2] => (item=/ansible/roles/rabbitmq/templates/rabbitmq.conf.j2)
2025-09-27 00:47:02.161920 | orchestrator |
2025-09-27 00:47:02.161931 | orchestrator | TASK [rabbitmq : Copying over erl_inetrc] **************************************
2025-09-27 00:47:02.161942 | orchestrator | Saturday 27 September 2025 00:45:05 +0000 (0:00:02.765) 0:00:15.957 ****
2025-09-27 00:47:02.161953 | orchestrator | changed: [testbed-node-0] => (item=/ansible/roles/rabbitmq/templates/erl_inetrc.j2)
2025-09-27 00:47:02.161963 | orchestrator | changed: [testbed-node-1] => (item=/ansible/roles/rabbitmq/templates/erl_inetrc.j2)
2025-09-27 00:47:02.161979 | orchestrator | changed: [testbed-node-2] => (item=/ansible/roles/rabbitmq/templates/erl_inetrc.j2)
2025-09-27 00:47:02.161990 | orchestrator |
2025-09-27 00:47:02.162001 | orchestrator | TASK [rabbitmq : Copying over advanced.config] *********************************
2025-09-27 00:47:02.162012 | orchestrator | Saturday 27 September 2025 00:45:07 +0000 (0:00:02.308) 0:00:18.266 ****
2025-09-27 00:47:02.162102 | orchestrator | changed: [testbed-node-0] => (item=/ansible/roles/rabbitmq/templates/advanced.config.j2)
2025-09-27 00:47:02.162114 | orchestrator | changed: [testbed-node-1] => (item=/ansible/roles/rabbitmq/templates/advanced.config.j2)
2025-09-27 00:47:02.162125 | orchestrator | changed: [testbed-node-2] => (item=/ansible/roles/rabbitmq/templates/advanced.config.j2)
2025-09-27 00:47:02.162136 | orchestrator |
2025-09-27 00:47:02.162147 | orchestrator | TASK [rabbitmq : Copying over definitions.json] ********************************
2025-09-27 00:47:02.162158 | orchestrator | Saturday 27 September 2025 00:45:10 +0000 (0:00:02.391) 0:00:20.657 ****
2025-09-27 00:47:02.162169 | orchestrator | changed: [testbed-node-0] => (item=/ansible/roles/rabbitmq/templates/definitions.json.j2)
2025-09-27 00:47:02.162180 | orchestrator | changed: [testbed-node-1] => (item=/ansible/roles/rabbitmq/templates/definitions.json.j2)
2025-09-27 00:47:02.162190 | orchestrator | changed: [testbed-node-2] => (item=/ansible/roles/rabbitmq/templates/definitions.json.j2)
2025-09-27 00:47:02.162201 | orchestrator |
2025-09-27 00:47:02.162229 | orchestrator | TASK [rabbitmq : Copying over enabled_plugins] *********************************
2025-09-27 00:47:02.162241 | orchestrator | Saturday 27 September 2025 00:45:11 +0000 (0:00:01.355) 0:00:22.013 ****
2025-09-27 00:47:02.162251 | orchestrator | changed: [testbed-node-0] => (item=/ansible/roles/rabbitmq/templates/enabled_plugins.j2)
2025-09-27 00:47:02.162262 | orchestrator | changed: [testbed-node-1] => (item=/ansible/roles/rabbitmq/templates/enabled_plugins.j2)
2025-09-27 00:47:02.162273 | orchestrator | changed: [testbed-node-2] => (item=/ansible/roles/rabbitmq/templates/enabled_plugins.j2)
2025-09-27 00:47:02.162284 | orchestrator |
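Note: the copy tasks above all follow the same pattern: render a Jinja2 template from the rabbitmq role into /etc/kolla/rabbitmq/ on each controller, a directory the container later mounts read-only at /var/lib/kolla/config_files/ (see the volume list in the service definition). A condensed sketch of that pattern; the file mode and the single looped task are assumptions, the real role keeps one task per file and notifies the restart handler that appears further down:

    - name: Copying over RabbitMQ configuration files
      ansible.builtin.template:
        src: "/ansible/roles/rabbitmq/templates/{{ item }}.j2"
        dest: "/etc/kolla/rabbitmq/{{ item }}"
        mode: "0660"
      loop:
        - rabbitmq-env.conf
        - rabbitmq.conf
        - erl_inetrc
        - advanced.config
        - definitions.json
        - enabled_plugins
      notify:
        - Restart rabbitmq container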
2025-09-27 00:47:02.162295 | orchestrator | TASK [rabbitmq : include_tasks] ************************************************
2025-09-27 00:47:02.162305 | orchestrator | Saturday 27 September 2025 00:45:12 +0000 (0:00:01.187) 0:00:23.200 ****
2025-09-27 00:47:02.162316 | orchestrator | skipping: [testbed-node-0]
2025-09-27 00:47:02.162336 | orchestrator | skipping: [testbed-node-1]
2025-09-27 00:47:02.162347 | orchestrator | skipping: [testbed-node-2]
2025-09-27 00:47:02.162358 | orchestrator |
2025-09-27 00:47:02.162369 | orchestrator | TASK [rabbitmq : Check rabbitmq containers] ************************************
2025-09-27 00:47:02.162380 | orchestrator | Saturday 27 September 2025 00:45:13 +0000 (0:00:00.617) 0:00:23.818 ****
2025-09-27 00:47:02.162392 | orchestrator | changed: [testbed-node-1] => (item={'key': 'rabbitmq', 'value': {'container_name': 'rabbitmq', 'group': 'rabbitmq', 'enabled': True, 'image': 'registry.osism.tech/kolla/rabbitmq:2024.2', 'bootstrap_environment': {'KOLLA_BOOTSTRAP': None, 'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': 'zdd6geSBXefcI7IoHnP1U1fxtRWS3u5QtnPCvQTT', 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': 'zdd6geSBXefcI7IoHnP1U1fxtRWS3u5QtnPCvQTT', 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'volumes': ['/etc/kolla/rabbitmq/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'rabbitmq:/var/lib/rabbitmq/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_rabbitmq'], 'timeout': '30'}, 'haproxy': {'rabbitmq_management': {'enabled': 'yes', 'mode': 'http', 'port': '15672', 'host_group': 'rabbitmq'}}}})
2025-09-27 00:47:02.162405 | orchestrator | changed: [testbed-node-0] => (item={'key': 'rabbitmq', 'value': {'container_name': 'rabbitmq', 'group': 'rabbitmq', 'enabled': True, 'image': 'registry.osism.tech/kolla/rabbitmq:2024.2', 'bootstrap_environment': {'KOLLA_BOOTSTRAP': None, 'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': 'zdd6geSBXefcI7IoHnP1U1fxtRWS3u5QtnPCvQTT', 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': 'zdd6geSBXefcI7IoHnP1U1fxtRWS3u5QtnPCvQTT', 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'volumes': ['/etc/kolla/rabbitmq/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'rabbitmq:/var/lib/rabbitmq/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_rabbitmq'], 'timeout': '30'}, 'haproxy': {'rabbitmq_management': {'enabled': 'yes', 'mode': 'http', 'port': '15672', 'host_group': 'rabbitmq'}}}})
2025-09-27 00:47:02.162433 | orchestrator | changed: [testbed-node-2] => (item={'key': 'rabbitmq', 'value': {'container_name': 'rabbitmq', 'group': 'rabbitmq', 'enabled': True, 'image': 'registry.osism.tech/kolla/rabbitmq:2024.2', 'bootstrap_environment': {'KOLLA_BOOTSTRAP': None, 'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': 'zdd6geSBXefcI7IoHnP1U1fxtRWS3u5QtnPCvQTT', 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': 'zdd6geSBXefcI7IoHnP1U1fxtRWS3u5QtnPCvQTT', 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'volumes': ['/etc/kolla/rabbitmq/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'rabbitmq:/var/lib/rabbitmq/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_rabbitmq'], 'timeout': '30'}, 'haproxy': {'rabbitmq_management': {'enabled': 'yes', 'mode': 'http', 'port': '15672', 'host_group': 'rabbitmq'}}}})
2025-09-27 00:47:02.162446 | orchestrator |
2025-09-27 00:47:02.162457 | orchestrator | TASK [rabbitmq : Creating rabbitmq volume] *************************************
2025-09-27 00:47:02.162468 | orchestrator | Saturday 27 September 2025 00:45:14 +0000 (0:00:01.339) 0:00:25.157 ****
2025-09-27 00:47:02.162478 | orchestrator | changed: [testbed-node-0]
2025-09-27 00:47:02.162489 | orchestrator | changed: [testbed-node-1]
2025-09-27 00:47:02.162500 | orchestrator | changed: [testbed-node-2]
2025-09-27 00:47:02.162517 | orchestrator |
2025-09-27 00:47:02.162529 | orchestrator | TASK [rabbitmq : Running RabbitMQ bootstrap container] *************************
2025-09-27 00:47:02.162539 | orchestrator | Saturday 27 September 2025 00:45:15 +0000 (0:00:00.920) 0:00:26.078 ****
2025-09-27 00:47:02.162550 | orchestrator | changed: [testbed-node-0]
2025-09-27 00:47:02.162561 | orchestrator | changed: [testbed-node-1]
2025-09-27 00:47:02.162572 | orchestrator | changed: [testbed-node-2]
2025-09-27 00:47:02.162583 | orchestrator |
2025-09-27 00:47:02.162593 | orchestrator | RUNNING HANDLER [rabbitmq : Restart rabbitmq container] ************************
2025-09-27 00:47:02.162604 | orchestrator | Saturday 27 September 2025 00:45:21 +0000 (0:00:06.546) 0:00:32.625 ****
2025-09-27 00:47:02.162615 | orchestrator | changed: [testbed-node-0]
2025-09-27 00:47:02.162626 | orchestrator | changed: [testbed-node-1]
2025-09-27 00:47:02.162637 | orchestrator | changed: [testbed-node-2]
2025-09-27 00:47:02.162648 | orchestrator |
2025-09-27 00:47:02.162658 | orchestrator | PLAY [Restart rabbitmq services] ***********************************************
2025-09-27 00:47:02.162669 | orchestrator |
2025-09-27 00:47:02.162680 | orchestrator | TASK [rabbitmq : Get info on RabbitMQ container] *******************************
2025-09-27 00:47:02.162691 | orchestrator | Saturday 27 September 2025 00:45:22 +0000 (0:00:00.431) 0:00:33.057 ****
2025-09-27 00:47:02.162702 | orchestrator | ok: [testbed-node-0]
2025-09-27 00:47:02.162712 | orchestrator |
2025-09-27 00:47:02.162723 | orchestrator | TASK [rabbitmq : Put RabbitMQ node into maintenance mode] **********************
2025-09-27 00:47:02.162734 | orchestrator | Saturday 27 September 2025 00:45:23 +0000 (0:00:00.675) 0:00:33.732 ****
2025-09-27 00:47:02.162745 | orchestrator | skipping: [testbed-node-0]
2025-09-27 00:47:02.162756 | orchestrator |
2025-09-27 00:47:02.162767 | orchestrator | TASK [rabbitmq : Restart rabbitmq container] ***********************************
2025-09-27 00:47:02.162778 | orchestrator | Saturday 27 September 2025 00:45:23 +0000 (0:00:00.247) 0:00:33.980 ****
2025-09-27 00:47:02.162788 | orchestrator | changed: [testbed-node-0]
2025-09-27 00:47:02.162799 | orchestrator |
2025-09-27 00:47:02.162810 | orchestrator | TASK [rabbitmq : Waiting for rabbitmq to start] ********************************
2025-09-27 00:47:02.162821 | orchestrator | Saturday 27 September 2025 00:45:25 +0000 (0:00:02.002) 0:00:35.983 ****
2025-09-27 00:47:02.162832 | orchestrator | changed: [testbed-node-0]
2025-09-27 00:47:02.162842 | orchestrator |
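Note: from here on the restart proceeds node by node. Each of the following 'Restart rabbitmq services' plays handles a single controller: it inspects the running container, skips the maintenance-mode drain on a fresh deploy, restarts the container and then waits until RabbitMQ answers again before the next node is touched (the wait on testbed-node-0 takes roughly 55 seconds in the timings below). A rough sketch of that rolling pattern; the serial value, port, drain command and wait strategy are assumptions rather than the role's exact implementation:

    - name: Restart rabbitmq services
      hosts: rabbitmq
      serial: 1
      tasks:
        - name: Put RabbitMQ node into maintenance mode
          # condition assumed; the log shows this task skipped on a fresh deploy
          ansible.builtin.command: docker exec rabbitmq rabbitmq-upgrade drain
          when: kolla_action_rabbitmq | default('deploy') == 'upgrade'

        - name: Restart rabbitmq container
          ansible.builtin.command: docker restart rabbitmq

        - name: Waiting for rabbitmq to start
          ansible.builtin.wait_for:
            host: "{{ ansible_host }}"
            port: 15672
            timeout: 300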
orchestrator | PLAY [Restart rabbitmq services] *********************************************** 2025-09-27 00:47:02.162864 | orchestrator | 2025-09-27 00:47:02.162875 | orchestrator | TASK [rabbitmq : Get info on RabbitMQ container] ******************************* 2025-09-27 00:47:02.162886 | orchestrator | Saturday 27 September 2025 00:46:20 +0000 (0:00:54.707) 0:01:30.690 **** 2025-09-27 00:47:02.162897 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:47:02.162908 | orchestrator | 2025-09-27 00:47:02.162918 | orchestrator | TASK [rabbitmq : Put RabbitMQ node into maintenance mode] ********************** 2025-09-27 00:47:02.162929 | orchestrator | Saturday 27 September 2025 00:46:20 +0000 (0:00:00.709) 0:01:31.400 **** 2025-09-27 00:47:02.162940 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:47:02.162951 | orchestrator | 2025-09-27 00:47:02.162962 | orchestrator | TASK [rabbitmq : Restart rabbitmq container] *********************************** 2025-09-27 00:47:02.162973 | orchestrator | Saturday 27 September 2025 00:46:21 +0000 (0:00:00.433) 0:01:31.833 **** 2025-09-27 00:47:02.162984 | orchestrator | changed: [testbed-node-1] 2025-09-27 00:47:02.162995 | orchestrator | 2025-09-27 00:47:02.163006 | orchestrator | TASK [rabbitmq : Waiting for rabbitmq to start] ******************************** 2025-09-27 00:47:02.163016 | orchestrator | Saturday 27 September 2025 00:46:23 +0000 (0:00:02.105) 0:01:33.939 **** 2025-09-27 00:47:02.163027 | orchestrator | changed: [testbed-node-1] 2025-09-27 00:47:02.163038 | orchestrator | 2025-09-27 00:47:02.163049 | orchestrator | PLAY [Restart rabbitmq services] *********************************************** 2025-09-27 00:47:02.163059 | orchestrator | 2025-09-27 00:47:02.163070 | orchestrator | TASK [rabbitmq : Get info on RabbitMQ container] ******************************* 2025-09-27 00:47:02.163081 | orchestrator | Saturday 27 September 2025 00:46:39 +0000 (0:00:15.857) 0:01:49.796 **** 2025-09-27 00:47:02.163092 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:47:02.163108 | orchestrator | 2025-09-27 00:47:02.163119 | orchestrator | TASK [rabbitmq : Put RabbitMQ node into maintenance mode] ********************** 2025-09-27 00:47:02.163130 | orchestrator | Saturday 27 September 2025 00:46:39 +0000 (0:00:00.561) 0:01:50.357 **** 2025-09-27 00:47:02.163141 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:47:02.163152 | orchestrator | 2025-09-27 00:47:02.163163 | orchestrator | TASK [rabbitmq : Restart rabbitmq container] *********************************** 2025-09-27 00:47:02.163178 | orchestrator | Saturday 27 September 2025 00:46:39 +0000 (0:00:00.221) 0:01:50.579 **** 2025-09-27 00:47:02.163189 | orchestrator | changed: [testbed-node-2] 2025-09-27 00:47:02.163200 | orchestrator | 2025-09-27 00:47:02.163239 | orchestrator | TASK [rabbitmq : Waiting for rabbitmq to start] ******************************** 2025-09-27 00:47:02.163256 | orchestrator | Saturday 27 September 2025 00:46:41 +0000 (0:00:01.597) 0:01:52.176 **** 2025-09-27 00:47:02.163267 | orchestrator | changed: [testbed-node-2] 2025-09-27 00:47:02.163278 | orchestrator | 2025-09-27 00:47:02.163289 | orchestrator | PLAY [Apply rabbitmq post-configuration] *************************************** 2025-09-27 00:47:02.163300 | orchestrator | 2025-09-27 00:47:02.163311 | orchestrator | TASK [Include rabbitmq post-deploy.yml] **************************************** 2025-09-27 00:47:02.163322 | orchestrator | Saturday 27 September 2025 00:46:58 +0000 (0:00:16.534) 0:02:08.711 
**** 2025-09-27 00:47:02.163333 | orchestrator | included: rabbitmq for testbed-node-0, testbed-node-1, testbed-node-2 2025-09-27 00:47:02.163344 | orchestrator | 2025-09-27 00:47:02.163355 | orchestrator | TASK [rabbitmq : Enable all stable feature flags] ****************************** 2025-09-27 00:47:02.163366 | orchestrator | Saturday 27 September 2025 00:46:59 +0000 (0:00:01.022) 0:02:09.733 **** 2025-09-27 00:47:02.163376 | orchestrator | [WARNING]: Could not match supplied host pattern, ignoring: 2025-09-27 00:47:02.163387 | orchestrator | enable_outward_rabbitmq_True 2025-09-27 00:47:02.163398 | orchestrator | [WARNING]: Could not match supplied host pattern, ignoring: 2025-09-27 00:47:02.163409 | orchestrator | outward_rabbitmq_restart 2025-09-27 00:47:02.163420 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:47:02.163431 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:47:02.163441 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:47:02.163452 | orchestrator | 2025-09-27 00:47:02.163463 | orchestrator | PLAY [Apply role rabbitmq (outward)] ******************************************* 2025-09-27 00:47:02.163474 | orchestrator | skipping: no hosts matched 2025-09-27 00:47:02.163485 | orchestrator | 2025-09-27 00:47:02.163496 | orchestrator | PLAY [Restart rabbitmq (outward) services] ************************************* 2025-09-27 00:47:02.163506 | orchestrator | skipping: no hosts matched 2025-09-27 00:47:02.163517 | orchestrator | 2025-09-27 00:47:02.163528 | orchestrator | PLAY [Apply rabbitmq (outward) post-configuration] ***************************** 2025-09-27 00:47:02.163539 | orchestrator | skipping: no hosts matched 2025-09-27 00:47:02.163549 | orchestrator | 2025-09-27 00:47:02.163560 | orchestrator | PLAY RECAP ********************************************************************* 2025-09-27 00:47:02.163571 | orchestrator | localhost : ok=3  changed=0 unreachable=0 failed=0 skipped=1  rescued=0 ignored=1  2025-09-27 00:47:02.163583 | orchestrator | testbed-node-0 : ok=23  changed=14  unreachable=0 failed=0 skipped=8  rescued=0 ignored=0 2025-09-27 00:47:02.163594 | orchestrator | testbed-node-1 : ok=21  changed=14  unreachable=0 failed=0 skipped=2  rescued=0 ignored=0 2025-09-27 00:47:02.163606 | orchestrator | testbed-node-2 : ok=21  changed=14  unreachable=0 failed=0 skipped=2  rescued=0 ignored=0 2025-09-27 00:47:02.163616 | orchestrator | 2025-09-27 00:47:02.163627 | orchestrator | 2025-09-27 00:47:02.163638 | orchestrator | TASKS RECAP ******************************************************************** 2025-09-27 00:47:02.163649 | orchestrator | Saturday 27 September 2025 00:47:01 +0000 (0:00:02.552) 0:02:12.285 **** 2025-09-27 00:47:02.163660 | orchestrator | =============================================================================== 2025-09-27 00:47:02.163680 | orchestrator | rabbitmq : Waiting for rabbitmq to start ------------------------------- 87.10s 2025-09-27 00:47:02.163691 | orchestrator | rabbitmq : Running RabbitMQ bootstrap container ------------------------- 6.55s 2025-09-27 00:47:02.163702 | orchestrator | rabbitmq : Restart rabbitmq container ----------------------------------- 5.71s 2025-09-27 00:47:02.163712 | orchestrator | Check RabbitMQ service -------------------------------------------------- 2.83s 2025-09-27 00:47:02.163723 | orchestrator | rabbitmq : Copying over rabbitmq.conf ----------------------------------- 2.77s 2025-09-27 00:47:02.163734 | orchestrator | rabbitmq : Enable all stable feature flags 
------------------------------ 2.55s 2025-09-27 00:47:02.163745 | orchestrator | rabbitmq : Copying over advanced.config --------------------------------- 2.39s 2025-09-27 00:47:02.163756 | orchestrator | rabbitmq : Copying over erl_inetrc -------------------------------------- 2.31s 2025-09-27 00:47:02.163767 | orchestrator | rabbitmq : Get info on RabbitMQ container ------------------------------- 1.95s 2025-09-27 00:47:02.163778 | orchestrator | rabbitmq : Copying over rabbitmq-env.conf ------------------------------- 1.59s 2025-09-27 00:47:02.163788 | orchestrator | rabbitmq : Copying over config.json files for services ------------------ 1.50s 2025-09-27 00:47:02.163799 | orchestrator | rabbitmq : Copying over definitions.json -------------------------------- 1.36s 2025-09-27 00:47:02.163810 | orchestrator | rabbitmq : Check rabbitmq containers ------------------------------------ 1.34s 2025-09-27 00:47:02.163821 | orchestrator | rabbitmq : Copying over enabled_plugins --------------------------------- 1.19s 2025-09-27 00:47:02.163831 | orchestrator | Include rabbitmq post-deploy.yml ---------------------------------------- 1.02s 2025-09-27 00:47:02.163842 | orchestrator | rabbitmq : Get container facts ------------------------------------------ 0.92s 2025-09-27 00:47:02.163853 | orchestrator | rabbitmq : Creating rabbitmq volume ------------------------------------- 0.92s 2025-09-27 00:47:02.163864 | orchestrator | rabbitmq : Put RabbitMQ node into maintenance mode ---------------------- 0.90s 2025-09-27 00:47:02.163874 | orchestrator | rabbitmq : Get container facts ------------------------------------------ 0.87s 2025-09-27 00:47:02.163885 | orchestrator | rabbitmq : Ensuring config directories exist ---------------------------- 0.80s 2025-09-27 00:47:02.163906 | orchestrator | 2025-09-27 00:47:02 | INFO  | Wait 1 second(s) until the next check 2025-09-27 00:47:05.191754 | orchestrator | 2025-09-27 00:47:05 | INFO  | Task 625d002c-1b01-4e2f-8d5c-d5ab32c60c73 is in state STARTED 2025-09-27 00:47:05.192144 | orchestrator | 2025-09-27 00:47:05 | INFO  | Task 4bac4415-47ae-418f-938a-24c52f1a62d3 is in state STARTED 2025-09-27 00:47:05.196028 | orchestrator | 2025-09-27 00:47:05 | INFO  | Task 1d877c1d-aea7-49a7-ac1d-fd8e8254af58 is in state STARTED 2025-09-27 00:47:05.196067 | orchestrator | 2025-09-27 00:47:05 | INFO  | Wait 1 second(s) until the next check 2025-09-27 00:47:08.226876 | orchestrator | 2025-09-27 00:47:08 | INFO  | Task 625d002c-1b01-4e2f-8d5c-d5ab32c60c73 is in state STARTED 2025-09-27 00:47:08.227581 | orchestrator | 2025-09-27 00:47:08 | INFO  | Task 4bac4415-47ae-418f-938a-24c52f1a62d3 is in state STARTED 2025-09-27 00:47:08.229609 | orchestrator | 2025-09-27 00:47:08 | INFO  | Task 1d877c1d-aea7-49a7-ac1d-fd8e8254af58 is in state SUCCESS 2025-09-27 00:47:08.231019 | orchestrator | 2025-09-27 00:47:08.231057 | orchestrator | 2025-09-27 00:47:08.231249 | orchestrator | PLAY [Prepare all k3s nodes] *************************************************** 2025-09-27 00:47:08.231264 | orchestrator | 2025-09-27 00:47:08.231275 | orchestrator | TASK [k3s_prereq : Validating arguments against arg spec 'main' - Prerequisites] *** 2025-09-27 00:47:08.231287 | orchestrator | Saturday 27 September 2025 00:43:32 +0000 (0:00:00.129) 0:00:00.129 **** 2025-09-27 00:47:08.231298 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:47:08.231310 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:47:08.231321 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:47:08.231331 | orchestrator | 
ok: [testbed-node-0] 2025-09-27 00:47:08.231342 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:47:08.231378 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:47:08.231390 | orchestrator | 2025-09-27 00:47:08.231400 | orchestrator | TASK [k3s_prereq : Set same timezone on every Server] ************************** 2025-09-27 00:47:08.231411 | orchestrator | Saturday 27 September 2025 00:43:32 +0000 (0:00:00.571) 0:00:00.701 **** 2025-09-27 00:47:08.231422 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:47:08.231433 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:47:08.231444 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:47:08.231455 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:47:08.231465 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:47:08.231476 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:47:08.231486 | orchestrator | 2025-09-27 00:47:08.231497 | orchestrator | TASK [k3s_prereq : Set SELinux to disabled state] ****************************** 2025-09-27 00:47:08.231508 | orchestrator | Saturday 27 September 2025 00:43:33 +0000 (0:00:00.507) 0:00:01.209 **** 2025-09-27 00:47:08.231518 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:47:08.231529 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:47:08.231540 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:47:08.231551 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:47:08.231562 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:47:08.231572 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:47:08.231583 | orchestrator | 2025-09-27 00:47:08.231593 | orchestrator | TASK [k3s_prereq : Enable IPv4 forwarding] ************************************* 2025-09-27 00:47:08.231604 | orchestrator | Saturday 27 September 2025 00:43:33 +0000 (0:00:00.556) 0:00:01.766 **** 2025-09-27 00:47:08.231614 | orchestrator | changed: [testbed-node-5] 2025-09-27 00:47:08.231625 | orchestrator | changed: [testbed-node-4] 2025-09-27 00:47:08.231635 | orchestrator | changed: [testbed-node-0] 2025-09-27 00:47:08.231646 | orchestrator | changed: [testbed-node-2] 2025-09-27 00:47:08.231656 | orchestrator | changed: [testbed-node-3] 2025-09-27 00:47:08.231667 | orchestrator | changed: [testbed-node-1] 2025-09-27 00:47:08.231677 | orchestrator | 2025-09-27 00:47:08.231688 | orchestrator | TASK [k3s_prereq : Enable IPv6 forwarding] ************************************* 2025-09-27 00:47:08.231699 | orchestrator | Saturday 27 September 2025 00:43:36 +0000 (0:00:02.923) 0:00:04.689 **** 2025-09-27 00:47:08.231709 | orchestrator | changed: [testbed-node-3] 2025-09-27 00:47:08.231720 | orchestrator | changed: [testbed-node-4] 2025-09-27 00:47:08.231730 | orchestrator | changed: [testbed-node-5] 2025-09-27 00:47:08.231741 | orchestrator | changed: [testbed-node-0] 2025-09-27 00:47:08.231751 | orchestrator | changed: [testbed-node-1] 2025-09-27 00:47:08.231762 | orchestrator | changed: [testbed-node-2] 2025-09-27 00:47:08.231772 | orchestrator | 2025-09-27 00:47:08.231783 | orchestrator | TASK [k3s_prereq : Enable IPv6 router advertisements] ************************** 2025-09-27 00:47:08.231793 | orchestrator | Saturday 27 September 2025 00:43:37 +0000 (0:00:01.023) 0:00:05.713 **** 2025-09-27 00:47:08.231804 | orchestrator | changed: [testbed-node-3] 2025-09-27 00:47:08.231814 | orchestrator | changed: [testbed-node-4] 2025-09-27 00:47:08.231825 | orchestrator | changed: [testbed-node-5] 2025-09-27 00:47:08.231836 | orchestrator | changed: [testbed-node-0] 
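[Annotation] The k3s_prereq tasks in this play apply kernel networking prerequisites (IPv4/IPv6 forwarding and IPv6 router advertisements) on all six testbed nodes via sysctl. A minimal spot-check on any node might look like the sketch below; the exact sysctl keys are assumptions inferred from the task names, not taken from the role itself:

  # hypothetical spot-check of the forwarding settings applied by k3s_prereq
  sysctl net.ipv4.ip_forward            # expected: 1
  sysctl net.ipv6.conf.all.forwarding   # expected: 1
  sysctl net.ipv6.conf.all.accept_ra    # router-advertisement handling; the value the role sets is assumed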
2025-09-27 00:47:08.231847 | orchestrator | changed: [testbed-node-1] 2025-09-27 00:47:08.231859 | orchestrator | changed: [testbed-node-2] 2025-09-27 00:47:08.231872 | orchestrator | 2025-09-27 00:47:08.231885 | orchestrator | TASK [k3s_prereq : Add br_netfilter to /etc/modules-load.d/] ******************* 2025-09-27 00:47:08.231897 | orchestrator | Saturday 27 September 2025 00:43:39 +0000 (0:00:01.439) 0:00:07.152 **** 2025-09-27 00:47:08.231909 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:47:08.231921 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:47:08.231934 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:47:08.231946 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:47:08.231958 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:47:08.231970 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:47:08.231982 | orchestrator | 2025-09-27 00:47:08.231994 | orchestrator | TASK [k3s_prereq : Load br_netfilter] ****************************************** 2025-09-27 00:47:08.232015 | orchestrator | Saturday 27 September 2025 00:43:40 +0000 (0:00:00.983) 0:00:08.136 **** 2025-09-27 00:47:08.232027 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:47:08.232040 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:47:08.232052 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:47:08.232064 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:47:08.232076 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:47:08.232088 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:47:08.232100 | orchestrator | 2025-09-27 00:47:08.232113 | orchestrator | TASK [k3s_prereq : Set bridge-nf-call-iptables (just to be sure)] ************** 2025-09-27 00:47:08.232141 | orchestrator | Saturday 27 September 2025 00:43:41 +0000 (0:00:01.601) 0:00:09.737 **** 2025-09-27 00:47:08.232155 | orchestrator | skipping: [testbed-node-3] => (item=net.bridge.bridge-nf-call-iptables)  2025-09-27 00:47:08.232167 | orchestrator | skipping: [testbed-node-3] => (item=net.bridge.bridge-nf-call-ip6tables)  2025-09-27 00:47:08.232180 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:47:08.232193 | orchestrator | skipping: [testbed-node-4] => (item=net.bridge.bridge-nf-call-iptables)  2025-09-27 00:47:08.232204 | orchestrator | skipping: [testbed-node-4] => (item=net.bridge.bridge-nf-call-ip6tables)  2025-09-27 00:47:08.232247 | orchestrator | skipping: [testbed-node-5] => (item=net.bridge.bridge-nf-call-iptables)  2025-09-27 00:47:08.232258 | orchestrator | skipping: [testbed-node-5] => (item=net.bridge.bridge-nf-call-ip6tables)  2025-09-27 00:47:08.232269 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:47:08.232279 | orchestrator | skipping: [testbed-node-0] => (item=net.bridge.bridge-nf-call-iptables)  2025-09-27 00:47:08.232290 | orchestrator | skipping: [testbed-node-0] => (item=net.bridge.bridge-nf-call-ip6tables)  2025-09-27 00:47:08.232312 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:47:08.232323 | orchestrator | skipping: [testbed-node-1] => (item=net.bridge.bridge-nf-call-iptables)  2025-09-27 00:47:08.232334 | orchestrator | skipping: [testbed-node-1] => (item=net.bridge.bridge-nf-call-ip6tables)  2025-09-27 00:47:08.232345 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:47:08.232355 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:47:08.232366 | orchestrator | skipping: [testbed-node-2] => (item=net.bridge.bridge-nf-call-iptables)  2025-09-27 00:47:08.232377 | orchestrator | skipping: 
[testbed-node-2] => (item=net.bridge.bridge-nf-call-ip6tables)  2025-09-27 00:47:08.232387 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:47:08.232398 | orchestrator | 2025-09-27 00:47:08.232409 | orchestrator | TASK [k3s_prereq : Add /usr/local/bin to sudo secure_path] ********************* 2025-09-27 00:47:08.232419 | orchestrator | Saturday 27 September 2025 00:43:42 +0000 (0:00:00.556) 0:00:10.294 **** 2025-09-27 00:47:08.232430 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:47:08.232440 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:47:08.232451 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:47:08.232462 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:47:08.232472 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:47:08.232483 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:47:08.232493 | orchestrator | 2025-09-27 00:47:08.232504 | orchestrator | TASK [k3s_download : Validating arguments against arg spec 'main' - Manage the downloading of K3S binaries] *** 2025-09-27 00:47:08.232516 | orchestrator | Saturday 27 September 2025 00:43:43 +0000 (0:00:01.136) 0:00:11.430 **** 2025-09-27 00:47:08.232526 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:47:08.232537 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:47:08.232548 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:47:08.232558 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:47:08.232569 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:47:08.232580 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:47:08.232590 | orchestrator | 2025-09-27 00:47:08.232601 | orchestrator | TASK [k3s_download : Download k3s binary x64] ********************************** 2025-09-27 00:47:08.232612 | orchestrator | Saturday 27 September 2025 00:43:44 +0000 (0:00:01.152) 0:00:12.582 **** 2025-09-27 00:47:08.232630 | orchestrator | changed: [testbed-node-4] 2025-09-27 00:47:08.232641 | orchestrator | changed: [testbed-node-3] 2025-09-27 00:47:08.232651 | orchestrator | changed: [testbed-node-5] 2025-09-27 00:47:08.232662 | orchestrator | changed: [testbed-node-2] 2025-09-27 00:47:08.232672 | orchestrator | changed: [testbed-node-1] 2025-09-27 00:47:08.232683 | orchestrator | changed: [testbed-node-0] 2025-09-27 00:47:08.232693 | orchestrator | 2025-09-27 00:47:08.232704 | orchestrator | TASK [k3s_download : Download k3s binary arm64] ******************************** 2025-09-27 00:47:08.232715 | orchestrator | Saturday 27 September 2025 00:43:50 +0000 (0:00:06.015) 0:00:18.598 **** 2025-09-27 00:47:08.232725 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:47:08.232736 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:47:08.232747 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:47:08.232757 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:47:08.232768 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:47:08.232778 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:47:08.232789 | orchestrator | 2025-09-27 00:47:08.232800 | orchestrator | TASK [k3s_download : Download k3s binary armhf] ******************************** 2025-09-27 00:47:08.232811 | orchestrator | Saturday 27 September 2025 00:43:52 +0000 (0:00:01.891) 0:00:20.490 **** 2025-09-27 00:47:08.232821 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:47:08.232832 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:47:08.232842 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:47:08.232853 | orchestrator | skipping: [testbed-node-0] 2025-09-27 
00:47:08.232863 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:47:08.232874 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:47:08.232885 | orchestrator | 2025-09-27 00:47:08.232895 | orchestrator | TASK [k3s_custom_registries : Validating arguments against arg spec 'main' - Configure the use of a custom container registry] *** 2025-09-27 00:47:08.232908 | orchestrator | Saturday 27 September 2025 00:43:54 +0000 (0:00:02.139) 0:00:22.630 **** 2025-09-27 00:47:08.232918 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:47:08.232929 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:47:08.232940 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:47:08.232950 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:47:08.232961 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:47:08.232971 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:47:08.232982 | orchestrator | 2025-09-27 00:47:08.232993 | orchestrator | TASK [k3s_custom_registries : Create directory /etc/rancher/k3s] *************** 2025-09-27 00:47:08.233003 | orchestrator | Saturday 27 September 2025 00:43:55 +0000 (0:00:00.818) 0:00:23.448 **** 2025-09-27 00:47:08.233014 | orchestrator | changed: [testbed-node-5] => (item=rancher) 2025-09-27 00:47:08.233025 | orchestrator | changed: [testbed-node-3] => (item=rancher) 2025-09-27 00:47:08.233035 | orchestrator | changed: [testbed-node-4] => (item=rancher) 2025-09-27 00:47:08.233046 | orchestrator | changed: [testbed-node-0] => (item=rancher) 2025-09-27 00:47:08.233057 | orchestrator | changed: [testbed-node-1] => (item=rancher) 2025-09-27 00:47:08.233073 | orchestrator | changed: [testbed-node-5] => (item=rancher/k3s) 2025-09-27 00:47:08.233084 | orchestrator | changed: [testbed-node-3] => (item=rancher/k3s) 2025-09-27 00:47:08.233095 | orchestrator | changed: [testbed-node-4] => (item=rancher/k3s) 2025-09-27 00:47:08.233105 | orchestrator | changed: [testbed-node-2] => (item=rancher) 2025-09-27 00:47:08.233116 | orchestrator | changed: [testbed-node-0] => (item=rancher/k3s) 2025-09-27 00:47:08.233126 | orchestrator | changed: [testbed-node-1] => (item=rancher/k3s) 2025-09-27 00:47:08.233137 | orchestrator | changed: [testbed-node-2] => (item=rancher/k3s) 2025-09-27 00:47:08.233147 | orchestrator | 2025-09-27 00:47:08.233158 | orchestrator | TASK [k3s_custom_registries : Insert registries into /etc/rancher/k3s/registries.yaml] *** 2025-09-27 00:47:08.233169 | orchestrator | Saturday 27 September 2025 00:43:57 +0000 (0:00:01.621) 0:00:25.069 **** 2025-09-27 00:47:08.233180 | orchestrator | changed: [testbed-node-3] 2025-09-27 00:47:08.233190 | orchestrator | changed: [testbed-node-5] 2025-09-27 00:47:08.233201 | orchestrator | changed: [testbed-node-4] 2025-09-27 00:47:08.233236 | orchestrator | changed: [testbed-node-0] 2025-09-27 00:47:08.233247 | orchestrator | changed: [testbed-node-2] 2025-09-27 00:47:08.233257 | orchestrator | changed: [testbed-node-1] 2025-09-27 00:47:08.233268 | orchestrator | 2025-09-27 00:47:08.233287 | orchestrator | PLAY [Deploy k3s master nodes] ************************************************* 2025-09-27 00:47:08.233298 | orchestrator | 2025-09-27 00:47:08.233309 | orchestrator | TASK [k3s_server : Validating arguments against arg spec 'main' - Setup k3s servers] *** 2025-09-27 00:47:08.233320 | orchestrator | Saturday 27 September 2025 00:43:58 +0000 (0:00:01.581) 0:00:26.651 **** 2025-09-27 00:47:08.233345 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:47:08.233356 | orchestrator | ok: [testbed-node-1] 2025-09-27 
00:47:08.233377 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:47:08.233388 | orchestrator | 2025-09-27 00:47:08.233399 | orchestrator | TASK [k3s_server : Stop k3s-init] ********************************************** 2025-09-27 00:47:08.233409 | orchestrator | Saturday 27 September 2025 00:44:00 +0000 (0:00:01.202) 0:00:27.853 **** 2025-09-27 00:47:08.233420 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:47:08.233430 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:47:08.233441 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:47:08.233452 | orchestrator | 2025-09-27 00:47:08.233462 | orchestrator | TASK [k3s_server : Stop k3s] *************************************************** 2025-09-27 00:47:08.233473 | orchestrator | Saturday 27 September 2025 00:44:01 +0000 (0:00:01.480) 0:00:29.333 **** 2025-09-27 00:47:08.233484 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:47:08.233494 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:47:08.233505 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:47:08.233515 | orchestrator | 2025-09-27 00:47:08.233526 | orchestrator | TASK [k3s_server : Clean previous runs of k3s-init] **************************** 2025-09-27 00:47:08.233537 | orchestrator | Saturday 27 September 2025 00:44:02 +0000 (0:00:00.836) 0:00:30.170 **** 2025-09-27 00:47:08.233548 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:47:08.233558 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:47:08.233569 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:47:08.233815 | orchestrator | 2025-09-27 00:47:08.233844 | orchestrator | TASK [k3s_server : Deploy K3s http_proxy conf] ********************************* 2025-09-27 00:47:08.233857 | orchestrator | Saturday 27 September 2025 00:44:03 +0000 (0:00:01.305) 0:00:31.475 **** 2025-09-27 00:47:08.233869 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:47:08.233880 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:47:08.233891 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:47:08.233902 | orchestrator | 2025-09-27 00:47:08.233913 | orchestrator | TASK [k3s_server : Create /etc/rancher/k3s directory] ************************** 2025-09-27 00:47:08.233924 | orchestrator | Saturday 27 September 2025 00:44:04 +0000 (0:00:00.495) 0:00:31.971 **** 2025-09-27 00:47:08.233934 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:47:08.233945 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:47:08.233956 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:47:08.233966 | orchestrator | 2025-09-27 00:47:08.233977 | orchestrator | TASK [k3s_server : Create custom resolv.conf for k3s] ************************** 2025-09-27 00:47:08.233988 | orchestrator | Saturday 27 September 2025 00:44:05 +0000 (0:00:00.930) 0:00:32.902 **** 2025-09-27 00:47:08.233998 | orchestrator | changed: [testbed-node-0] 2025-09-27 00:47:08.234009 | orchestrator | changed: [testbed-node-2] 2025-09-27 00:47:08.234061 | orchestrator | changed: [testbed-node-1] 2025-09-27 00:47:08.234072 | orchestrator | 2025-09-27 00:47:08.234083 | orchestrator | TASK [k3s_server : Deploy vip manifest] **************************************** 2025-09-27 00:47:08.234094 | orchestrator | Saturday 27 September 2025 00:44:06 +0000 (0:00:01.456) 0:00:34.358 **** 2025-09-27 00:47:08.234105 | orchestrator | included: /ansible/roles/k3s_server/tasks/vip.yml for testbed-node-0, testbed-node-1, testbed-node-2 2025-09-27 00:47:08.234117 | orchestrator | 2025-09-27 00:47:08.234127 | orchestrator | TASK [k3s_server : Set _kube_vip_bgp_peers fact] 
******************************* 2025-09-27 00:47:08.234138 | orchestrator | Saturday 27 September 2025 00:44:07 +0000 (0:00:00.988) 0:00:35.347 **** 2025-09-27 00:47:08.234149 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:47:08.234186 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:47:08.234198 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:47:08.234235 | orchestrator | 2025-09-27 00:47:08.234247 | orchestrator | TASK [k3s_server : Create manifests directory on first master] ***************** 2025-09-27 00:47:08.234258 | orchestrator | Saturday 27 September 2025 00:44:10 +0000 (0:00:03.051) 0:00:38.398 **** 2025-09-27 00:47:08.234268 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:47:08.234279 | orchestrator | changed: [testbed-node-0] 2025-09-27 00:47:08.234290 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:47:08.234300 | orchestrator | 2025-09-27 00:47:08.234311 | orchestrator | TASK [k3s_server : Download vip rbac manifest to first master] ***************** 2025-09-27 00:47:08.234322 | orchestrator | Saturday 27 September 2025 00:44:11 +0000 (0:00:00.982) 0:00:39.380 **** 2025-09-27 00:47:08.234332 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:47:08.234342 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:47:08.234353 | orchestrator | changed: [testbed-node-0] 2025-09-27 00:47:08.234363 | orchestrator | 2025-09-27 00:47:08.234374 | orchestrator | TASK [k3s_server : Copy vip manifest to first master] ************************** 2025-09-27 00:47:08.234384 | orchestrator | Saturday 27 September 2025 00:44:12 +0000 (0:00:01.179) 0:00:40.560 **** 2025-09-27 00:47:08.234395 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:47:08.234405 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:47:08.234416 | orchestrator | changed: [testbed-node-0] 2025-09-27 00:47:08.234427 | orchestrator | 2025-09-27 00:47:08.234453 | orchestrator | TASK [k3s_server : Deploy metallb manifest] ************************************ 2025-09-27 00:47:08.234465 | orchestrator | Saturday 27 September 2025 00:44:14 +0000 (0:00:01.627) 0:00:42.187 **** 2025-09-27 00:47:08.234476 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:47:08.234486 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:47:08.234497 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:47:08.234507 | orchestrator | 2025-09-27 00:47:08.234517 | orchestrator | TASK [k3s_server : Deploy kube-vip manifest] *********************************** 2025-09-27 00:47:08.234528 | orchestrator | Saturday 27 September 2025 00:44:14 +0000 (0:00:00.468) 0:00:42.655 **** 2025-09-27 00:47:08.234539 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:47:08.234549 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:47:08.234560 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:47:08.234570 | orchestrator | 2025-09-27 00:47:08.234581 | orchestrator | TASK [k3s_server : Init cluster inside the transient k3s-init service] ********* 2025-09-27 00:47:08.234592 | orchestrator | Saturday 27 September 2025 00:44:15 +0000 (0:00:00.352) 0:00:43.007 **** 2025-09-27 00:47:08.234602 | orchestrator | changed: [testbed-node-0] 2025-09-27 00:47:08.234613 | orchestrator | changed: [testbed-node-1] 2025-09-27 00:47:08.234623 | orchestrator | changed: [testbed-node-2] 2025-09-27 00:47:08.234634 | orchestrator | 2025-09-27 00:47:08.234674 | orchestrator | TASK [k3s_server : Verify that all nodes actually joined (check k3s-init.service if this fails)] *** 2025-09-27 00:47:08.234686 | 
orchestrator | Saturday 27 September 2025 00:44:18 +0000 (0:00:02.960) 0:00:45.968 **** 2025-09-27 00:47:08.234697 | orchestrator | FAILED - RETRYING: [testbed-node-1]: Verify that all nodes actually joined (check k3s-init.service if this fails) (20 retries left). 2025-09-27 00:47:08.234710 | orchestrator | FAILED - RETRYING: [testbed-node-0]: Verify that all nodes actually joined (check k3s-init.service if this fails) (20 retries left). 2025-09-27 00:47:08.234721 | orchestrator | FAILED - RETRYING: [testbed-node-2]: Verify that all nodes actually joined (check k3s-init.service if this fails) (20 retries left). 2025-09-27 00:47:08.234731 | orchestrator | FAILED - RETRYING: [testbed-node-0]: Verify that all nodes actually joined (check k3s-init.service if this fails) (19 retries left). 2025-09-27 00:47:08.234742 | orchestrator | FAILED - RETRYING: [testbed-node-1]: Verify that all nodes actually joined (check k3s-init.service if this fails) (19 retries left). 2025-09-27 00:47:08.234753 | orchestrator | FAILED - RETRYING: [testbed-node-2]: Verify that all nodes actually joined (check k3s-init.service if this fails) (19 retries left). 2025-09-27 00:47:08.234771 | orchestrator | FAILED - RETRYING: [testbed-node-1]: Verify that all nodes actually joined (check k3s-init.service if this fails) (18 retries left). 2025-09-27 00:47:08.234782 | orchestrator | FAILED - RETRYING: [testbed-node-0]: Verify that all nodes actually joined (check k3s-init.service if this fails) (18 retries left). 2025-09-27 00:47:08.234793 | orchestrator | FAILED - RETRYING: [testbed-node-2]: Verify that all nodes actually joined (check k3s-init.service if this fails) (18 retries left). 2025-09-27 00:47:08.234804 | orchestrator | FAILED - RETRYING: [testbed-node-1]: Verify that all nodes actually joined (check k3s-init.service if this fails) (17 retries left). 2025-09-27 00:47:08.234814 | orchestrator | FAILED - RETRYING: [testbed-node-0]: Verify that all nodes actually joined (check k3s-init.service if this fails) (17 retries left). 2025-09-27 00:47:08.234825 | orchestrator | FAILED - RETRYING: [testbed-node-2]: Verify that all nodes actually joined (check k3s-init.service if this fails) (17 retries left). 2025-09-27 00:47:08.234836 | orchestrator | FAILED - RETRYING: [testbed-node-1]: Verify that all nodes actually joined (check k3s-init.service if this fails) (16 retries left). 2025-09-27 00:47:08.234847 | orchestrator | FAILED - RETRYING: [testbed-node-0]: Verify that all nodes actually joined (check k3s-init.service if this fails) (16 retries left). 
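[Annotation] The retries above come from the join-verification task, which polls until all three master nodes report as joined while the cluster is still running under the transient k3s-init service. If the retries were exhausted, a manual check along the lines of the task's own hint ("check k3s-init.service if this fails") might be, as a sketch with assumed paths and availability of the k3s binary on the node:

  # inspect the transient bootstrap service named in the task text
  journalctl -u k3s-init.service --no-pager | tail -n 50
  # list cluster members via the kubectl bundled with k3s (assumes k3s is on PATH)
  k3s kubectl get nodes -o wide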
2025-09-27 00:47:08.234857 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:47:08.234868 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:47:08.234879 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:47:08.234889 | orchestrator | 2025-09-27 00:47:08.234900 | orchestrator | TASK [k3s_server : Save logs of k3s-init.service] ****************************** 2025-09-27 00:47:08.234911 | orchestrator | Saturday 27 September 2025 00:45:12 +0000 (0:00:54.593) 0:01:40.561 **** 2025-09-27 00:47:08.234922 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:47:08.234932 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:47:08.234943 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:47:08.234954 | orchestrator | 2025-09-27 00:47:08.234964 | orchestrator | TASK [k3s_server : Kill the temporary service used for initialization] ********* 2025-09-27 00:47:08.234975 | orchestrator | Saturday 27 September 2025 00:45:13 +0000 (0:00:00.326) 0:01:40.887 **** 2025-09-27 00:47:08.234985 | orchestrator | changed: [testbed-node-0] 2025-09-27 00:47:08.234996 | orchestrator | changed: [testbed-node-1] 2025-09-27 00:47:08.235006 | orchestrator | changed: [testbed-node-2] 2025-09-27 00:47:08.235017 | orchestrator | 2025-09-27 00:47:08.235027 | orchestrator | TASK [k3s_server : Copy K3s service file] ************************************** 2025-09-27 00:47:08.235038 | orchestrator | Saturday 27 September 2025 00:45:14 +0000 (0:00:01.045) 0:01:41.933 **** 2025-09-27 00:47:08.235048 | orchestrator | changed: [testbed-node-0] 2025-09-27 00:47:08.235059 | orchestrator | changed: [testbed-node-1] 2025-09-27 00:47:08.235069 | orchestrator | changed: [testbed-node-2] 2025-09-27 00:47:08.235080 | orchestrator | 2025-09-27 00:47:08.235091 | orchestrator | TASK [k3s_server : Enable and check K3s service] ******************************* 2025-09-27 00:47:08.235101 | orchestrator | Saturday 27 September 2025 00:45:15 +0000 (0:00:01.259) 0:01:43.192 **** 2025-09-27 00:47:08.235112 | orchestrator | changed: [testbed-node-2] 2025-09-27 00:47:08.235122 | orchestrator | changed: [testbed-node-0] 2025-09-27 00:47:08.235133 | orchestrator | changed: [testbed-node-1] 2025-09-27 00:47:08.235144 | orchestrator | 2025-09-27 00:47:08.235839 | orchestrator | TASK [k3s_server : Wait for node-token] **************************************** 2025-09-27 00:47:08.235857 | orchestrator | Saturday 27 September 2025 00:45:41 +0000 (0:00:26.171) 0:02:09.363 **** 2025-09-27 00:47:08.235868 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:47:08.235879 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:47:08.235890 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:47:08.235900 | orchestrator | 2025-09-27 00:47:08.235911 | orchestrator | TASK [k3s_server : Register node-token file access mode] *********************** 2025-09-27 00:47:08.235921 | orchestrator | Saturday 27 September 2025 00:45:42 +0000 (0:00:00.701) 0:02:10.065 **** 2025-09-27 00:47:08.235941 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:47:08.235952 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:47:08.235962 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:47:08.235973 | orchestrator | 2025-09-27 00:47:08.235983 | orchestrator | TASK [k3s_server : Change file access node-token] ****************************** 2025-09-27 00:47:08.235994 | orchestrator | Saturday 27 September 2025 00:45:42 +0000 (0:00:00.618) 0:02:10.683 **** 2025-09-27 00:47:08.236017 | orchestrator | changed: [testbed-node-0] 2025-09-27 00:47:08.236028 | orchestrator | changed: 
[testbed-node-1] 2025-09-27 00:47:08.236039 | orchestrator | changed: [testbed-node-2] 2025-09-27 00:47:08.236050 | orchestrator | 2025-09-27 00:47:08.236060 | orchestrator | TASK [k3s_server : Read node-token from master] ******************************** 2025-09-27 00:47:08.236071 | orchestrator | Saturday 27 September 2025 00:45:43 +0000 (0:00:00.646) 0:02:11.330 **** 2025-09-27 00:47:08.236082 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:47:08.236092 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:47:08.236103 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:47:08.236114 | orchestrator | 2025-09-27 00:47:08.236124 | orchestrator | TASK [k3s_server : Store Master node-token] ************************************ 2025-09-27 00:47:08.236135 | orchestrator | Saturday 27 September 2025 00:45:44 +0000 (0:00:01.104) 0:02:12.434 **** 2025-09-27 00:47:08.236145 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:47:08.236161 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:47:08.236172 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:47:08.236182 | orchestrator | 2025-09-27 00:47:08.236193 | orchestrator | TASK [k3s_server : Restore node-token file access] ***************************** 2025-09-27 00:47:08.236203 | orchestrator | Saturday 27 September 2025 00:45:45 +0000 (0:00:00.372) 0:02:12.807 **** 2025-09-27 00:47:08.236244 | orchestrator | changed: [testbed-node-0] 2025-09-27 00:47:08.236255 | orchestrator | changed: [testbed-node-1] 2025-09-27 00:47:08.236266 | orchestrator | changed: [testbed-node-2] 2025-09-27 00:47:08.236276 | orchestrator | 2025-09-27 00:47:08.236287 | orchestrator | TASK [k3s_server : Create directory .kube] ************************************* 2025-09-27 00:47:08.236297 | orchestrator | Saturday 27 September 2025 00:45:45 +0000 (0:00:00.748) 0:02:13.556 **** 2025-09-27 00:47:08.236308 | orchestrator | changed: [testbed-node-0] 2025-09-27 00:47:08.236318 | orchestrator | changed: [testbed-node-1] 2025-09-27 00:47:08.236329 | orchestrator | changed: [testbed-node-2] 2025-09-27 00:47:08.236340 | orchestrator | 2025-09-27 00:47:08.236350 | orchestrator | TASK [k3s_server : Copy config file to user home directory] ******************** 2025-09-27 00:47:08.236361 | orchestrator | Saturday 27 September 2025 00:45:46 +0000 (0:00:00.719) 0:02:14.276 **** 2025-09-27 00:47:08.236371 | orchestrator | changed: [testbed-node-0] 2025-09-27 00:47:08.236382 | orchestrator | changed: [testbed-node-1] 2025-09-27 00:47:08.236392 | orchestrator | changed: [testbed-node-2] 2025-09-27 00:47:08.236403 | orchestrator | 2025-09-27 00:47:08.236413 | orchestrator | TASK [k3s_server : Configure kubectl cluster to https://192.168.16.8:6443] ***** 2025-09-27 00:47:08.236424 | orchestrator | Saturday 27 September 2025 00:45:47 +0000 (0:00:00.975) 0:02:15.251 **** 2025-09-27 00:47:08.236435 | orchestrator | changed: [testbed-node-0] 2025-09-27 00:47:08.236445 | orchestrator | changed: [testbed-node-1] 2025-09-27 00:47:08.236456 | orchestrator | changed: [testbed-node-2] 2025-09-27 00:47:08.236466 | orchestrator | 2025-09-27 00:47:08.236477 | orchestrator | TASK [k3s_server : Create kubectl symlink] ************************************* 2025-09-27 00:47:08.236488 | orchestrator | Saturday 27 September 2025 00:45:48 +0000 (0:00:00.804) 0:02:16.056 **** 2025-09-27 00:47:08.236498 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:47:08.236509 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:47:08.236520 | orchestrator | skipping: [testbed-node-2] 2025-09-27 
00:47:08.236530 | orchestrator | 2025-09-27 00:47:08.236541 | orchestrator | TASK [k3s_server : Create crictl symlink] ************************************** 2025-09-27 00:47:08.236551 | orchestrator | Saturday 27 September 2025 00:45:48 +0000 (0:00:00.270) 0:02:16.326 **** 2025-09-27 00:47:08.236562 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:47:08.236572 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:47:08.236590 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:47:08.236601 | orchestrator | 2025-09-27 00:47:08.236611 | orchestrator | TASK [k3s_server : Get contents of manifests folder] *************************** 2025-09-27 00:47:08.236622 | orchestrator | Saturday 27 September 2025 00:45:48 +0000 (0:00:00.242) 0:02:16.569 **** 2025-09-27 00:47:08.236633 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:47:08.236643 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:47:08.236654 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:47:08.236665 | orchestrator | 2025-09-27 00:47:08.236675 | orchestrator | TASK [k3s_server : Get sub dirs of manifests folder] *************************** 2025-09-27 00:47:08.236686 | orchestrator | Saturday 27 September 2025 00:45:49 +0000 (0:00:00.858) 0:02:17.428 **** 2025-09-27 00:47:08.236696 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:47:08.236707 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:47:08.236717 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:47:08.236728 | orchestrator | 2025-09-27 00:47:08.236739 | orchestrator | TASK [k3s_server : Remove manifests and folders that are only needed for bootstrapping cluster so k3s doesn't auto apply on start] *** 2025-09-27 00:47:08.236750 | orchestrator | Saturday 27 September 2025 00:45:50 +0000 (0:00:00.624) 0:02:18.052 **** 2025-09-27 00:47:08.236761 | orchestrator | changed: [testbed-node-0] => (item=/var/lib/rancher/k3s/server/manifests/rolebindings.yaml) 2025-09-27 00:47:08.236771 | orchestrator | changed: [testbed-node-1] => (item=/var/lib/rancher/k3s/server/manifests/rolebindings.yaml) 2025-09-27 00:47:08.236782 | orchestrator | changed: [testbed-node-2] => (item=/var/lib/rancher/k3s/server/manifests/rolebindings.yaml) 2025-09-27 00:47:08.236793 | orchestrator | changed: [testbed-node-0] => (item=/var/lib/rancher/k3s/server/manifests/local-storage.yaml) 2025-09-27 00:47:08.236803 | orchestrator | changed: [testbed-node-1] => (item=/var/lib/rancher/k3s/server/manifests/local-storage.yaml) 2025-09-27 00:47:08.236814 | orchestrator | changed: [testbed-node-2] => (item=/var/lib/rancher/k3s/server/manifests/local-storage.yaml) 2025-09-27 00:47:08.236824 | orchestrator | changed: [testbed-node-0] => (item=/var/lib/rancher/k3s/server/manifests/coredns.yaml) 2025-09-27 00:47:08.236835 | orchestrator | changed: [testbed-node-1] => (item=/var/lib/rancher/k3s/server/manifests/coredns.yaml) 2025-09-27 00:47:08.236846 | orchestrator | changed: [testbed-node-2] => (item=/var/lib/rancher/k3s/server/manifests/coredns.yaml) 2025-09-27 00:47:08.236856 | orchestrator | changed: [testbed-node-1] => (item=/var/lib/rancher/k3s/server/manifests/runtimes.yaml) 2025-09-27 00:47:08.236867 | orchestrator | changed: [testbed-node-0] => (item=/var/lib/rancher/k3s/server/manifests/vip.yaml) 2025-09-27 00:47:08.236884 | orchestrator | changed: [testbed-node-2] => (item=/var/lib/rancher/k3s/server/manifests/runtimes.yaml) 2025-09-27 00:47:08.236895 | orchestrator | changed: [testbed-node-1] => (item=/var/lib/rancher/k3s/server/manifests/ccm.yaml) 2025-09-27 00:47:08.236906 | 
orchestrator | changed: [testbed-node-0] => (item=/var/lib/rancher/k3s/server/manifests/vip-rbac.yaml) 2025-09-27 00:47:08.236916 | orchestrator | changed: [testbed-node-2] => (item=/var/lib/rancher/k3s/server/manifests/ccm.yaml) 2025-09-27 00:47:08.236927 | orchestrator | changed: [testbed-node-0] => (item=/var/lib/rancher/k3s/server/manifests/runtimes.yaml) 2025-09-27 00:47:08.236943 | orchestrator | changed: [testbed-node-1] => (item=/var/lib/rancher/k3s/server/manifests/metrics-server) 2025-09-27 00:47:08.236953 | orchestrator | changed: [testbed-node-2] => (item=/var/lib/rancher/k3s/server/manifests/metrics-server) 2025-09-27 00:47:08.236964 | orchestrator | changed: [testbed-node-0] => (item=/var/lib/rancher/k3s/server/manifests/ccm.yaml) 2025-09-27 00:47:08.236975 | orchestrator | changed: [testbed-node-0] => (item=/var/lib/rancher/k3s/server/manifests/metrics-server) 2025-09-27 00:47:08.236985 | orchestrator | 2025-09-27 00:47:08.236996 | orchestrator | PLAY [Deploy k3s worker nodes] ************************************************* 2025-09-27 00:47:08.237007 | orchestrator | 2025-09-27 00:47:08.237017 | orchestrator | TASK [k3s_agent : Validating arguments against arg spec 'main' - Setup k3s agents] *** 2025-09-27 00:47:08.237034 | orchestrator | Saturday 27 September 2025 00:45:53 +0000 (0:00:03.246) 0:02:21.299 **** 2025-09-27 00:47:08.237045 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:47:08.237055 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:47:08.237066 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:47:08.237077 | orchestrator | 2025-09-27 00:47:08.237087 | orchestrator | TASK [k3s_agent : Check if system is PXE-booted] ******************************* 2025-09-27 00:47:08.237098 | orchestrator | Saturday 27 September 2025 00:45:54 +0000 (0:00:00.763) 0:02:22.062 **** 2025-09-27 00:47:08.237108 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:47:08.237119 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:47:08.237129 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:47:08.237140 | orchestrator | 2025-09-27 00:47:08.237151 | orchestrator | TASK [k3s_agent : Set fact for PXE-booted system] ****************************** 2025-09-27 00:47:08.237161 | orchestrator | Saturday 27 September 2025 00:45:54 +0000 (0:00:00.722) 0:02:22.785 **** 2025-09-27 00:47:08.237172 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:47:08.237182 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:47:08.237193 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:47:08.237203 | orchestrator | 2025-09-27 00:47:08.237268 | orchestrator | TASK [k3s_agent : Include http_proxy configuration tasks] ********************** 2025-09-27 00:47:08.237280 | orchestrator | Saturday 27 September 2025 00:45:55 +0000 (0:00:00.556) 0:02:23.341 **** 2025-09-27 00:47:08.237291 | orchestrator | included: /ansible/roles/k3s_agent/tasks/http_proxy.yml for testbed-node-3, testbed-node-4, testbed-node-5 2025-09-27 00:47:08.237302 | orchestrator | 2025-09-27 00:47:08.237313 | orchestrator | TASK [k3s_agent : Create k3s-node.service.d directory] ************************* 2025-09-27 00:47:08.237323 | orchestrator | Saturday 27 September 2025 00:45:56 +0000 (0:00:00.797) 0:02:24.139 **** 2025-09-27 00:47:08.237334 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:47:08.237345 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:47:08.237355 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:47:08.237366 | orchestrator | 2025-09-27 00:47:08.237377 | orchestrator | TASK [k3s_agent : Copy 
K3s http_proxy conf file] ******************************* 2025-09-27 00:47:08.237387 | orchestrator | Saturday 27 September 2025 00:45:56 +0000 (0:00:00.303) 0:02:24.443 **** 2025-09-27 00:47:08.237398 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:47:08.237409 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:47:08.237419 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:47:08.237430 | orchestrator | 2025-09-27 00:47:08.237441 | orchestrator | TASK [k3s_agent : Deploy K3s http_proxy conf] ********************************** 2025-09-27 00:47:08.237451 | orchestrator | Saturday 27 September 2025 00:45:56 +0000 (0:00:00.327) 0:02:24.771 **** 2025-09-27 00:47:08.237462 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:47:08.237473 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:47:08.237484 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:47:08.237494 | orchestrator | 2025-09-27 00:47:08.237505 | orchestrator | TASK [k3s_agent : Create /etc/rancher/k3s directory] *************************** 2025-09-27 00:47:08.237516 | orchestrator | Saturday 27 September 2025 00:45:57 +0000 (0:00:00.335) 0:02:25.106 **** 2025-09-27 00:47:08.237526 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:47:08.237537 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:47:08.237548 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:47:08.237558 | orchestrator | 2025-09-27 00:47:08.237569 | orchestrator | TASK [k3s_agent : Create custom resolv.conf for k3s] *************************** 2025-09-27 00:47:08.237579 | orchestrator | Saturday 27 September 2025 00:45:57 +0000 (0:00:00.667) 0:02:25.774 **** 2025-09-27 00:47:08.237590 | orchestrator | changed: [testbed-node-3] 2025-09-27 00:47:08.237601 | orchestrator | changed: [testbed-node-4] 2025-09-27 00:47:08.237611 | orchestrator | changed: [testbed-node-5] 2025-09-27 00:47:08.237622 | orchestrator | 2025-09-27 00:47:08.237633 | orchestrator | TASK [k3s_agent : Configure the k3s service] *********************************** 2025-09-27 00:47:08.237643 | orchestrator | Saturday 27 September 2025 00:45:59 +0000 (0:00:01.398) 0:02:27.173 **** 2025-09-27 00:47:08.237663 | orchestrator | changed: [testbed-node-3] 2025-09-27 00:47:08.237673 | orchestrator | changed: [testbed-node-4] 2025-09-27 00:47:08.237684 | orchestrator | changed: [testbed-node-5] 2025-09-27 00:47:08.237695 | orchestrator | 2025-09-27 00:47:08.237705 | orchestrator | TASK [k3s_agent : Manage k3s service] ****************************************** 2025-09-27 00:47:08.237716 | orchestrator | Saturday 27 September 2025 00:46:00 +0000 (0:00:01.394) 0:02:28.567 **** 2025-09-27 00:47:08.237727 | orchestrator | changed: [testbed-node-3] 2025-09-27 00:47:08.237737 | orchestrator | changed: [testbed-node-5] 2025-09-27 00:47:08.237748 | orchestrator | changed: [testbed-node-4] 2025-09-27 00:47:08.237758 | orchestrator | 2025-09-27 00:47:08.237769 | orchestrator | PLAY [Prepare kubeconfig file] ************************************************* 2025-09-27 00:47:08.237780 | orchestrator | 2025-09-27 00:47:08.237797 | orchestrator | TASK [Get home directory of operator user] ************************************* 2025-09-27 00:47:08.237809 | orchestrator | Saturday 27 September 2025 00:46:12 +0000 (0:00:11.827) 0:02:40.395 **** 2025-09-27 00:47:08.237819 | orchestrator | ok: [testbed-manager] 2025-09-27 00:47:08.237830 | orchestrator | 2025-09-27 00:47:08.237841 | orchestrator | TASK [Create .kube directory] ************************************************** 
2025-09-27 00:47:08.237851 | orchestrator | Saturday 27 September 2025 00:46:13 +0000 (0:00:00.832) 0:02:41.227 **** 2025-09-27 00:47:08.237862 | orchestrator | changed: [testbed-manager] 2025-09-27 00:47:08.237873 | orchestrator | 2025-09-27 00:47:08.237883 | orchestrator | TASK [Get kubeconfig file] ***************************************************** 2025-09-27 00:47:08.237894 | orchestrator | Saturday 27 September 2025 00:46:13 +0000 (0:00:00.318) 0:02:41.546 **** 2025-09-27 00:47:08.237905 | orchestrator | ok: [testbed-manager -> testbed-node-0(192.168.16.10)] 2025-09-27 00:47:08.237916 | orchestrator | 2025-09-27 00:47:08.237931 | orchestrator | TASK [Write kubeconfig file] *************************************************** 2025-09-27 00:47:08.237942 | orchestrator | Saturday 27 September 2025 00:46:14 +0000 (0:00:00.486) 0:02:42.033 **** 2025-09-27 00:47:08.237953 | orchestrator | changed: [testbed-manager] 2025-09-27 00:47:08.237964 | orchestrator | 2025-09-27 00:47:08.237974 | orchestrator | TASK [Change server address in the kubeconfig] ********************************* 2025-09-27 00:47:08.237985 | orchestrator | Saturday 27 September 2025 00:46:15 +0000 (0:00:00.944) 0:02:42.977 **** 2025-09-27 00:47:08.237996 | orchestrator | changed: [testbed-manager] 2025-09-27 00:47:08.238006 | orchestrator | 2025-09-27 00:47:08.238073 | orchestrator | TASK [Make kubeconfig available for use inside the manager service] ************ 2025-09-27 00:47:08.238088 | orchestrator | Saturday 27 September 2025 00:46:15 +0000 (0:00:00.429) 0:02:43.407 **** 2025-09-27 00:47:08.238110 | orchestrator | changed: [testbed-manager -> localhost] 2025-09-27 00:47:08.238121 | orchestrator | 2025-09-27 00:47:08.238132 | orchestrator | TASK [Change server address in the kubeconfig inside the manager service] ****** 2025-09-27 00:47:08.238143 | orchestrator | Saturday 27 September 2025 00:46:16 +0000 (0:00:01.260) 0:02:44.667 **** 2025-09-27 00:47:08.238153 | orchestrator | changed: [testbed-manager -> localhost] 2025-09-27 00:47:08.238164 | orchestrator | 2025-09-27 00:47:08.238175 | orchestrator | TASK [Set KUBECONFIG environment variable] ************************************* 2025-09-27 00:47:08.238185 | orchestrator | Saturday 27 September 2025 00:46:17 +0000 (0:00:00.731) 0:02:45.399 **** 2025-09-27 00:47:08.238196 | orchestrator | changed: [testbed-manager] 2025-09-27 00:47:08.238224 | orchestrator | 2025-09-27 00:47:08.238235 | orchestrator | TASK [Enable kubectl command line completion] ********************************** 2025-09-27 00:47:08.238246 | orchestrator | Saturday 27 September 2025 00:46:17 +0000 (0:00:00.341) 0:02:45.741 **** 2025-09-27 00:47:08.238257 | orchestrator | changed: [testbed-manager] 2025-09-27 00:47:08.238267 | orchestrator | 2025-09-27 00:47:08.238278 | orchestrator | PLAY [Apply role kubectl] ****************************************************** 2025-09-27 00:47:08.238289 | orchestrator | 2025-09-27 00:47:08.238299 | orchestrator | TASK [kubectl : Gather variables for each operating system] ******************** 2025-09-27 00:47:08.238310 | orchestrator | Saturday 27 September 2025 00:46:18 +0000 (0:00:00.518) 0:02:46.259 **** 2025-09-27 00:47:08.238321 | orchestrator | ok: [testbed-manager] 2025-09-27 00:47:08.238342 | orchestrator | 2025-09-27 00:47:08.238353 | orchestrator | TASK [kubectl : Include distribution specific install tasks] ******************* 2025-09-27 00:47:08.238363 | orchestrator | Saturday 27 September 2025 00:46:18 +0000 (0:00:00.116) 0:02:46.376 **** 
2025-09-27 00:47:08.238374 | orchestrator | included: /ansible/roles/kubectl/tasks/install-Debian-family.yml for testbed-manager 2025-09-27 00:47:08.238384 | orchestrator | 2025-09-27 00:47:08.238395 | orchestrator | TASK [kubectl : Remove old architecture-dependent repository] ****************** 2025-09-27 00:47:08.238406 | orchestrator | Saturday 27 September 2025 00:46:18 +0000 (0:00:00.185) 0:02:46.562 **** 2025-09-27 00:47:08.238416 | orchestrator | ok: [testbed-manager] 2025-09-27 00:47:08.238427 | orchestrator | 2025-09-27 00:47:08.238437 | orchestrator | TASK [kubectl : Install apt-transport-https package] *************************** 2025-09-27 00:47:08.238448 | orchestrator | Saturday 27 September 2025 00:46:19 +0000 (0:00:00.647) 0:02:47.209 **** 2025-09-27 00:47:08.238459 | orchestrator | ok: [testbed-manager] 2025-09-27 00:47:08.238469 | orchestrator | 2025-09-27 00:47:08.238480 | orchestrator | TASK [kubectl : Add repository gpg key] **************************************** 2025-09-27 00:47:08.238490 | orchestrator | Saturday 27 September 2025 00:46:20 +0000 (0:00:01.263) 0:02:48.473 **** 2025-09-27 00:47:08.238501 | orchestrator | changed: [testbed-manager] 2025-09-27 00:47:08.238512 | orchestrator | 2025-09-27 00:47:08.238522 | orchestrator | TASK [kubectl : Set permissions of gpg key] ************************************ 2025-09-27 00:47:08.238533 | orchestrator | Saturday 27 September 2025 00:46:21 +0000 (0:00:00.882) 0:02:49.355 **** 2025-09-27 00:47:08.238543 | orchestrator | ok: [testbed-manager] 2025-09-27 00:47:08.238554 | orchestrator | 2025-09-27 00:47:08.238565 | orchestrator | TASK [kubectl : Add repository Debian] ***************************************** 2025-09-27 00:47:08.238575 | orchestrator | Saturday 27 September 2025 00:46:22 +0000 (0:00:00.569) 0:02:49.924 **** 2025-09-27 00:47:08.238586 | orchestrator | changed: [testbed-manager] 2025-09-27 00:47:08.238597 | orchestrator | 2025-09-27 00:47:08.238607 | orchestrator | TASK [kubectl : Install required packages] ************************************* 2025-09-27 00:47:08.238618 | orchestrator | Saturday 27 September 2025 00:46:28 +0000 (0:00:06.143) 0:02:56.068 **** 2025-09-27 00:47:08.238629 | orchestrator | changed: [testbed-manager] 2025-09-27 00:47:08.238639 | orchestrator | 2025-09-27 00:47:08.238650 | orchestrator | TASK [kubectl : Remove kubectl symlink] **************************************** 2025-09-27 00:47:08.238660 | orchestrator | Saturday 27 September 2025 00:46:40 +0000 (0:00:12.236) 0:03:08.305 **** 2025-09-27 00:47:08.238671 | orchestrator | ok: [testbed-manager] 2025-09-27 00:47:08.238682 | orchestrator | 2025-09-27 00:47:08.238693 | orchestrator | PLAY [Run post actions on master nodes] **************************************** 2025-09-27 00:47:08.238703 | orchestrator | 2025-09-27 00:47:08.238714 | orchestrator | TASK [k3s_server_post : Validating arguments against arg spec 'main' - Configure k3s cluster] *** 2025-09-27 00:47:08.238724 | orchestrator | Saturday 27 September 2025 00:46:41 +0000 (0:00:00.522) 0:03:08.828 **** 2025-09-27 00:47:08.238735 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:47:08.238745 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:47:08.238756 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:47:08.238767 | orchestrator | 2025-09-27 00:47:08.238784 | orchestrator | TASK [k3s_server_post : Deploy calico] ***************************************** 2025-09-27 00:47:08.238796 | orchestrator | Saturday 27 September 2025 00:46:41 +0000 (0:00:00.297) 
0:03:09.125 **** 2025-09-27 00:47:08.238807 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:47:08.238817 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:47:08.238828 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:47:08.238839 | orchestrator | 2025-09-27 00:47:08.238849 | orchestrator | TASK [k3s_server_post : Deploy cilium] ***************************************** 2025-09-27 00:47:08.238860 | orchestrator | Saturday 27 September 2025 00:46:41 +0000 (0:00:00.297) 0:03:09.422 **** 2025-09-27 00:47:08.238871 | orchestrator | included: /ansible/roles/k3s_server_post/tasks/cilium.yml for testbed-node-0, testbed-node-1, testbed-node-2 2025-09-27 00:47:08.238881 | orchestrator | 2025-09-27 00:47:08.238897 | orchestrator | TASK [k3s_server_post : Create tmp directory on first master] ****************** 2025-09-27 00:47:08.238915 | orchestrator | Saturday 27 September 2025 00:46:42 +0000 (0:00:00.735) 0:03:10.158 **** 2025-09-27 00:47:08.238926 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:47:08.238937 | orchestrator | 2025-09-27 00:47:08.238947 | orchestrator | TASK [k3s_server_post : Check if Cilium CLI is installed] ********************** 2025-09-27 00:47:08.238958 | orchestrator | Saturday 27 September 2025 00:46:42 +0000 (0:00:00.218) 0:03:10.377 **** 2025-09-27 00:47:08.238969 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:47:08.238979 | orchestrator | 2025-09-27 00:47:08.238990 | orchestrator | TASK [k3s_server_post : Check for Cilium CLI version in command output] ******** 2025-09-27 00:47:08.239001 | orchestrator | Saturday 27 September 2025 00:46:42 +0000 (0:00:00.330) 0:03:10.708 **** 2025-09-27 00:47:08.239011 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:47:08.239022 | orchestrator | 2025-09-27 00:47:08.239032 | orchestrator | TASK [k3s_server_post : Get latest stable Cilium CLI version file] ************* 2025-09-27 00:47:08.239043 | orchestrator | Saturday 27 September 2025 00:46:43 +0000 (0:00:00.211) 0:03:10.919 **** 2025-09-27 00:47:08.239054 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:47:08.239065 | orchestrator | 2025-09-27 00:47:08.239075 | orchestrator | TASK [k3s_server_post : Read Cilium CLI stable version from file] ************** 2025-09-27 00:47:08.239086 | orchestrator | Saturday 27 September 2025 00:46:43 +0000 (0:00:00.203) 0:03:11.123 **** 2025-09-27 00:47:08.239096 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:47:08.239107 | orchestrator | 2025-09-27 00:47:08.239118 | orchestrator | TASK [k3s_server_post : Log installed Cilium CLI version] ********************** 2025-09-27 00:47:08.239129 | orchestrator | Saturday 27 September 2025 00:46:43 +0000 (0:00:00.196) 0:03:11.320 **** 2025-09-27 00:47:08.239139 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:47:08.239150 | orchestrator | 2025-09-27 00:47:08.239160 | orchestrator | TASK [k3s_server_post : Log latest stable Cilium CLI version] ****************** 2025-09-27 00:47:08.239171 | orchestrator | Saturday 27 September 2025 00:46:43 +0000 (0:00:00.195) 0:03:11.516 **** 2025-09-27 00:47:08.239182 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:47:08.239192 | orchestrator | 2025-09-27 00:47:08.239203 | orchestrator | TASK [k3s_server_post : Determine if Cilium CLI needs installation or update] *** 2025-09-27 00:47:08.239230 | orchestrator | Saturday 27 September 2025 00:46:43 +0000 (0:00:00.188) 0:03:11.704 **** 2025-09-27 00:47:08.239241 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:47:08.239252 | 
orchestrator | 2025-09-27 00:47:08.239263 | orchestrator | TASK [k3s_server_post : Set architecture variable] ***************************** 2025-09-27 00:47:08.239273 | orchestrator | Saturday 27 September 2025 00:46:44 +0000 (0:00:00.290) 0:03:11.994 **** 2025-09-27 00:47:08.239284 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:47:08.239295 | orchestrator | 2025-09-27 00:47:08.239305 | orchestrator | TASK [k3s_server_post : Download Cilium CLI and checksum] ********************** 2025-09-27 00:47:08.239316 | orchestrator | Saturday 27 September 2025 00:46:44 +0000 (0:00:00.203) 0:03:12.198 **** 2025-09-27 00:47:08.239327 | orchestrator | skipping: [testbed-node-0] => (item=.tar.gz)  2025-09-27 00:47:08.239337 | orchestrator | skipping: [testbed-node-0] => (item=.tar.gz.sha256sum)  2025-09-27 00:47:08.239348 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:47:08.239358 | orchestrator | 2025-09-27 00:47:08.239369 | orchestrator | TASK [k3s_server_post : Verify the downloaded tarball] ************************* 2025-09-27 00:47:08.239380 | orchestrator | Saturday 27 September 2025 00:46:44 +0000 (0:00:00.547) 0:03:12.745 **** 2025-09-27 00:47:08.239391 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:47:08.239401 | orchestrator | 2025-09-27 00:47:08.239412 | orchestrator | TASK [k3s_server_post : Extract Cilium CLI to /usr/local/bin] ****************** 2025-09-27 00:47:08.239423 | orchestrator | Saturday 27 September 2025 00:46:45 +0000 (0:00:00.192) 0:03:12.938 **** 2025-09-27 00:47:08.239434 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:47:08.239445 | orchestrator | 2025-09-27 00:47:08.239455 | orchestrator | TASK [k3s_server_post : Remove downloaded tarball and checksum file] *********** 2025-09-27 00:47:08.239466 | orchestrator | Saturday 27 September 2025 00:46:45 +0000 (0:00:00.174) 0:03:13.112 **** 2025-09-27 00:47:08.239483 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:47:08.239494 | orchestrator | 2025-09-27 00:47:08.239505 | orchestrator | TASK [k3s_server_post : Wait for connectivity to kube VIP] ********************* 2025-09-27 00:47:08.239516 | orchestrator | Saturday 27 September 2025 00:46:45 +0000 (0:00:00.173) 0:03:13.286 **** 2025-09-27 00:47:08.239526 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:47:08.239537 | orchestrator | 2025-09-27 00:47:08.239548 | orchestrator | TASK [k3s_server_post : Fail if kube VIP not reachable] ************************ 2025-09-27 00:47:08.239558 | orchestrator | Saturday 27 September 2025 00:46:45 +0000 (0:00:00.165) 0:03:13.451 **** 2025-09-27 00:47:08.239569 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:47:08.239580 | orchestrator | 2025-09-27 00:47:08.239590 | orchestrator | TASK [k3s_server_post : Test for existing Cilium install] ********************** 2025-09-27 00:47:08.239601 | orchestrator | Saturday 27 September 2025 00:46:45 +0000 (0:00:00.200) 0:03:13.652 **** 2025-09-27 00:47:08.239612 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:47:08.239623 | orchestrator | 2025-09-27 00:47:08.239633 | orchestrator | TASK [k3s_server_post : Check Cilium version] ********************************** 2025-09-27 00:47:08.239644 | orchestrator | Saturday 27 September 2025 00:46:46 +0000 (0:00:00.180) 0:03:13.833 **** 2025-09-27 00:47:08.239655 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:47:08.239666 | orchestrator | 2025-09-27 00:47:08.239684 | orchestrator | 2025-09-27 00:47:08 | INFO  | Wait 1 second(s) until the next check 
2025-09-27 00:47:08.239695 | orchestrator | TASK [k3s_server_post : Parse installed Cilium version] ************************ 2025-09-27 00:47:08.239706 | orchestrator | Saturday 27 September 2025 00:46:46 +0000 (0:00:00.195) 0:03:14.028 **** 2025-09-27 00:47:08.239717 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:47:08.239727 | orchestrator | 2025-09-27 00:47:08.239738 | orchestrator | TASK [k3s_server_post : Determine if Cilium needs update] ********************** 2025-09-27 00:47:08.239749 | orchestrator | Saturday 27 September 2025 00:46:46 +0000 (0:00:00.175) 0:03:14.203 **** 2025-09-27 00:47:08.239759 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:47:08.239770 | orchestrator | 2025-09-27 00:47:08.239781 | orchestrator | TASK [k3s_server_post : Log result] ******************************************** 2025-09-27 00:47:08.239796 | orchestrator | Saturday 27 September 2025 00:46:46 +0000 (0:00:00.173) 0:03:14.377 **** 2025-09-27 00:47:08.239807 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:47:08.239818 | orchestrator | 2025-09-27 00:47:08.239828 | orchestrator | TASK [k3s_server_post : Install Cilium] **************************************** 2025-09-27 00:47:08.239839 | orchestrator | Saturday 27 September 2025 00:46:46 +0000 (0:00:00.188) 0:03:14.565 **** 2025-09-27 00:47:08.239850 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:47:08.239861 | orchestrator | 2025-09-27 00:47:08.239871 | orchestrator | TASK [k3s_server_post : Wait for Cilium resources] ***************************** 2025-09-27 00:47:08.239882 | orchestrator | Saturday 27 September 2025 00:46:46 +0000 (0:00:00.205) 0:03:14.770 **** 2025-09-27 00:47:08.239892 | orchestrator | skipping: [testbed-node-0] => (item=deployment/cilium-operator)  2025-09-27 00:47:08.239903 | orchestrator | skipping: [testbed-node-0] => (item=daemonset/cilium)  2025-09-27 00:47:08.239914 | orchestrator | skipping: [testbed-node-0] => (item=deployment/hubble-relay)  2025-09-27 00:47:08.239924 | orchestrator | skipping: [testbed-node-0] => (item=deployment/hubble-ui)  2025-09-27 00:47:08.239935 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:47:08.239945 | orchestrator | 2025-09-27 00:47:08.239956 | orchestrator | TASK [k3s_server_post : Set _cilium_bgp_neighbors fact] ************************ 2025-09-27 00:47:08.239967 | orchestrator | Saturday 27 September 2025 00:46:47 +0000 (0:00:00.726) 0:03:15.497 **** 2025-09-27 00:47:08.239977 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:47:08.239988 | orchestrator | 2025-09-27 00:47:08.239999 | orchestrator | TASK [k3s_server_post : Copy BGP manifests to first master] ******************** 2025-09-27 00:47:08.240010 | orchestrator | Saturday 27 September 2025 00:46:47 +0000 (0:00:00.202) 0:03:15.700 **** 2025-09-27 00:47:08.240027 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:47:08.240038 | orchestrator | 2025-09-27 00:47:08.240049 | orchestrator | TASK [k3s_server_post : Apply BGP manifests] *********************************** 2025-09-27 00:47:08.240060 | orchestrator | Saturday 27 September 2025 00:46:48 +0000 (0:00:00.190) 0:03:15.890 **** 2025-09-27 00:47:08.240070 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:47:08.240081 | orchestrator | 2025-09-27 00:47:08.240091 | orchestrator | TASK [k3s_server_post : Print error message if BGP manifests application fails] *** 2025-09-27 00:47:08.240102 | orchestrator | Saturday 27 September 2025 00:46:48 +0000 (0:00:00.170) 0:03:16.061 **** 2025-09-27 00:47:08.240112 | orchestrator | skipping: 
[testbed-node-0] 2025-09-27 00:47:08.240123 | orchestrator | 2025-09-27 00:47:08.240134 | orchestrator | TASK [k3s_server_post : Test for BGP config resources] ************************* 2025-09-27 00:47:08.240144 | orchestrator | Saturday 27 September 2025 00:46:48 +0000 (0:00:00.188) 0:03:16.250 **** 2025-09-27 00:47:08.240155 | orchestrator | skipping: [testbed-node-0] => (item=kubectl get CiliumBGPPeeringPolicy.cilium.io)  2025-09-27 00:47:08.240166 | orchestrator | skipping: [testbed-node-0] => (item=kubectl get CiliumLoadBalancerIPPool.cilium.io)  2025-09-27 00:47:08.240176 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:47:08.240187 | orchestrator | 2025-09-27 00:47:08.240197 | orchestrator | TASK [k3s_server_post : Deploy metallb pool] *********************************** 2025-09-27 00:47:08.240257 | orchestrator | Saturday 27 September 2025 00:46:48 +0000 (0:00:00.258) 0:03:16.509 **** 2025-09-27 00:47:08.240270 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:47:08.240281 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:47:08.240292 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:47:08.240303 | orchestrator | 2025-09-27 00:47:08.240314 | orchestrator | TASK [k3s_server_post : Remove tmp directory used for manifests] *************** 2025-09-27 00:47:08.240325 | orchestrator | Saturday 27 September 2025 00:46:49 +0000 (0:00:00.291) 0:03:16.800 **** 2025-09-27 00:47:08.240336 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:47:08.240346 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:47:08.240357 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:47:08.240368 | orchestrator | 2025-09-27 00:47:08.240379 | orchestrator | PLAY [Apply role k9s] ********************************************************** 2025-09-27 00:47:08.240390 | orchestrator | 2025-09-27 00:47:08.240401 | orchestrator | TASK [k9s : Gather variables for each operating system] ************************ 2025-09-27 00:47:08.240411 | orchestrator | Saturday 27 September 2025 00:46:49 +0000 (0:00:00.910) 0:03:17.710 **** 2025-09-27 00:47:08.240420 | orchestrator | ok: [testbed-manager] 2025-09-27 00:47:08.240430 | orchestrator | 2025-09-27 00:47:08.240440 | orchestrator | TASK [k9s : Include distribution specific install tasks] *********************** 2025-09-27 00:47:08.240449 | orchestrator | Saturday 27 September 2025 00:46:50 +0000 (0:00:00.124) 0:03:17.834 **** 2025-09-27 00:47:08.240459 | orchestrator | included: /ansible/roles/k9s/tasks/install-Debian-family.yml for testbed-manager 2025-09-27 00:47:08.240468 | orchestrator | 2025-09-27 00:47:08.240478 | orchestrator | TASK [k9s : Install k9s packages] ********************************************** 2025-09-27 00:47:08.240488 | orchestrator | Saturday 27 September 2025 00:46:50 +0000 (0:00:00.215) 0:03:18.050 **** 2025-09-27 00:47:08.240497 | orchestrator | changed: [testbed-manager] 2025-09-27 00:47:08.240507 | orchestrator | 2025-09-27 00:47:08.240516 | orchestrator | PLAY [Manage labels, annotations, and taints on all k3s nodes] ***************** 2025-09-27 00:47:08.240526 | orchestrator | 2025-09-27 00:47:08.240535 | orchestrator | TASK [Merge labels, annotations, and taints] *********************************** 2025-09-27 00:47:08.240545 | orchestrator | Saturday 27 September 2025 00:46:55 +0000 (0:00:05.389) 0:03:23.439 **** 2025-09-27 00:47:08.240561 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:47:08.240571 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:47:08.240580 | orchestrator | ok: [testbed-node-5] 
2025-09-27 00:47:08.240590 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:47:08.240599 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:47:08.240609 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:47:08.240618 | orchestrator | 2025-09-27 00:47:08.240634 | orchestrator | TASK [Manage labels] *********************************************************** 2025-09-27 00:47:08.240644 | orchestrator | Saturday 27 September 2025 00:46:56 +0000 (0:00:00.606) 0:03:24.045 **** 2025-09-27 00:47:08.240653 | orchestrator | ok: [testbed-node-3 -> localhost] => (item=node-role.osism.tech/compute-plane=true) 2025-09-27 00:47:08.240663 | orchestrator | ok: [testbed-node-4 -> localhost] => (item=node-role.osism.tech/compute-plane=true) 2025-09-27 00:47:08.240681 | orchestrator | ok: [testbed-node-0 -> localhost] => (item=node-role.osism.tech/control-plane=true) 2025-09-27 00:47:08.240691 | orchestrator | ok: [testbed-node-1 -> localhost] => (item=node-role.osism.tech/control-plane=true) 2025-09-27 00:47:08.240700 | orchestrator | ok: [testbed-node-5 -> localhost] => (item=node-role.osism.tech/compute-plane=true) 2025-09-27 00:47:08.240710 | orchestrator | ok: [testbed-node-2 -> localhost] => (item=node-role.osism.tech/control-plane=true) 2025-09-27 00:47:08.240719 | orchestrator | ok: [testbed-node-3 -> localhost] => (item=node-role.kubernetes.io/worker=worker) 2025-09-27 00:47:08.240729 | orchestrator | ok: [testbed-node-5 -> localhost] => (item=node-role.kubernetes.io/worker=worker) 2025-09-27 00:47:08.240738 | orchestrator | ok: [testbed-node-0 -> localhost] => (item=openstack-control-plane=enabled) 2025-09-27 00:47:08.240748 | orchestrator | ok: [testbed-node-1 -> localhost] => (item=openstack-control-plane=enabled) 2025-09-27 00:47:08.240757 | orchestrator | ok: [testbed-node-2 -> localhost] => (item=openstack-control-plane=enabled) 2025-09-27 00:47:08.240767 | orchestrator | ok: [testbed-node-4 -> localhost] => (item=node-role.kubernetes.io/worker=worker) 2025-09-27 00:47:08.240776 | orchestrator | ok: [testbed-node-1 -> localhost] => (item=node-role.osism.tech/network-plane=true) 2025-09-27 00:47:08.240786 | orchestrator | ok: [testbed-node-0 -> localhost] => (item=node-role.osism.tech/network-plane=true) 2025-09-27 00:47:08.240796 | orchestrator | ok: [testbed-node-5 -> localhost] => (item=node-role.osism.tech/rook-osd=true) 2025-09-27 00:47:08.240805 | orchestrator | ok: [testbed-node-4 -> localhost] => (item=node-role.osism.tech/rook-osd=true) 2025-09-27 00:47:08.240814 | orchestrator | ok: [testbed-node-3 -> localhost] => (item=node-role.osism.tech/rook-osd=true) 2025-09-27 00:47:08.240824 | orchestrator | ok: [testbed-node-2 -> localhost] => (item=node-role.osism.tech/network-plane=true) 2025-09-27 00:47:08.240834 | orchestrator | ok: [testbed-node-1 -> localhost] => (item=node-role.osism.tech/rook-mds=true) 2025-09-27 00:47:08.240843 | orchestrator | ok: [testbed-node-0 -> localhost] => (item=node-role.osism.tech/rook-mds=true) 2025-09-27 00:47:08.240853 | orchestrator | ok: [testbed-node-2 -> localhost] => (item=node-role.osism.tech/rook-mds=true) 2025-09-27 00:47:08.240862 | orchestrator | ok: [testbed-node-1 -> localhost] => (item=node-role.osism.tech/rook-mgr=true) 2025-09-27 00:47:08.240871 | orchestrator | ok: [testbed-node-2 -> localhost] => (item=node-role.osism.tech/rook-mgr=true) 2025-09-27 00:47:08.240881 | orchestrator | ok: [testbed-node-1 -> localhost] => (item=node-role.osism.tech/rook-mon=true) 2025-09-27 00:47:08.240891 | orchestrator | ok: [testbed-node-0 -> 
localhost] => (item=node-role.osism.tech/rook-mgr=true) 2025-09-27 00:47:08.240900 | orchestrator | ok: [testbed-node-2 -> localhost] => (item=node-role.osism.tech/rook-mon=true) 2025-09-27 00:47:08.240910 | orchestrator | ok: [testbed-node-1 -> localhost] => (item=node-role.osism.tech/rook-rgw=true) 2025-09-27 00:47:08.240919 | orchestrator | ok: [testbed-node-0 -> localhost] => (item=node-role.osism.tech/rook-mon=true) 2025-09-27 00:47:08.240929 | orchestrator | ok: [testbed-node-2 -> localhost] => (item=node-role.osism.tech/rook-rgw=true) 2025-09-27 00:47:08.240938 | orchestrator | ok: [testbed-node-0 -> localhost] => (item=node-role.osism.tech/rook-rgw=true) 2025-09-27 00:47:08.240948 | orchestrator | 2025-09-27 00:47:08.240957 | orchestrator | TASK [Manage annotations] ****************************************************** 2025-09-27 00:47:08.240967 | orchestrator | Saturday 27 September 2025 00:47:06 +0000 (0:00:10.477) 0:03:34.522 **** 2025-09-27 00:47:08.240977 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:47:08.240991 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:47:08.241001 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:47:08.241010 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:47:08.241020 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:47:08.241030 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:47:08.241039 | orchestrator | 2025-09-27 00:47:08.241049 | orchestrator | TASK [Manage taints] *********************************************************** 2025-09-27 00:47:08.241058 | orchestrator | Saturday 27 September 2025 00:47:07 +0000 (0:00:00.646) 0:03:35.169 **** 2025-09-27 00:47:08.241068 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:47:08.241077 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:47:08.241087 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:47:08.241096 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:47:08.241106 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:47:08.241115 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:47:08.241125 | orchestrator | 2025-09-27 00:47:08.241134 | orchestrator | PLAY RECAP ********************************************************************* 2025-09-27 00:47:08.241144 | orchestrator | testbed-manager : ok=21  changed=11  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 2025-09-27 00:47:08.241160 | orchestrator | testbed-node-0 : ok=42  changed=20  unreachable=0 failed=0 skipped=45  rescued=0 ignored=0 2025-09-27 00:47:08.241171 | orchestrator | testbed-node-1 : ok=39  changed=17  unreachable=0 failed=0 skipped=21  rescued=0 ignored=0 2025-09-27 00:47:08.241180 | orchestrator | testbed-node-2 : ok=39  changed=17  unreachable=0 failed=0 skipped=21  rescued=0 ignored=0 2025-09-27 00:47:08.241194 | orchestrator | testbed-node-3 : ok=19  changed=9  unreachable=0 failed=0 skipped=13  rescued=0 ignored=0 2025-09-27 00:47:08.241204 | orchestrator | testbed-node-4 : ok=19  changed=9  unreachable=0 failed=0 skipped=13  rescued=0 ignored=0 2025-09-27 00:47:08.241229 | orchestrator | testbed-node-5 : ok=19  changed=9  unreachable=0 failed=0 skipped=13  rescued=0 ignored=0 2025-09-27 00:47:08.241239 | orchestrator | 2025-09-27 00:47:08.241249 | orchestrator | 2025-09-27 00:47:08.241259 | orchestrator | TASKS RECAP ******************************************************************** 2025-09-27 00:47:08.241268 | orchestrator | Saturday 27 September 2025 00:47:07 +0000 (0:00:00.396) 0:03:35.565 **** 2025-09-27 
00:47:08.241278 | orchestrator | =============================================================================== 2025-09-27 00:47:08.241287 | orchestrator | k3s_server : Verify that all nodes actually joined (check k3s-init.service if this fails) -- 54.59s 2025-09-27 00:47:08.241297 | orchestrator | k3s_server : Enable and check K3s service ------------------------------ 26.17s 2025-09-27 00:47:08.241306 | orchestrator | kubectl : Install required packages ------------------------------------ 12.24s 2025-09-27 00:47:08.241316 | orchestrator | k3s_agent : Manage k3s service ----------------------------------------- 11.83s 2025-09-27 00:47:08.241325 | orchestrator | Manage labels ---------------------------------------------------------- 10.48s 2025-09-27 00:47:08.241335 | orchestrator | kubectl : Add repository Debian ----------------------------------------- 6.14s 2025-09-27 00:47:08.241344 | orchestrator | k3s_download : Download k3s binary x64 ---------------------------------- 6.02s 2025-09-27 00:47:08.241353 | orchestrator | k9s : Install k9s packages ---------------------------------------------- 5.39s 2025-09-27 00:47:08.241363 | orchestrator | k3s_server : Remove manifests and folders that are only needed for bootstrapping cluster so k3s doesn't auto apply on start --- 3.25s 2025-09-27 00:47:08.241373 | orchestrator | k3s_server : Set _kube_vip_bgp_peers fact ------------------------------- 3.05s 2025-09-27 00:47:08.241382 | orchestrator | k3s_server : Init cluster inside the transient k3s-init service --------- 2.96s 2025-09-27 00:47:08.241397 | orchestrator | k3s_prereq : Enable IPv4 forwarding ------------------------------------- 2.92s 2025-09-27 00:47:08.241407 | orchestrator | k3s_download : Download k3s binary armhf -------------------------------- 2.14s 2025-09-27 00:47:08.241416 | orchestrator | k3s_download : Download k3s binary arm64 -------------------------------- 1.89s 2025-09-27 00:47:08.241426 | orchestrator | k3s_server : Copy vip manifest to first master -------------------------- 1.63s 2025-09-27 00:47:08.241435 | orchestrator | k3s_custom_registries : Create directory /etc/rancher/k3s --------------- 1.62s 2025-09-27 00:47:08.241445 | orchestrator | k3s_prereq : Load br_netfilter ------------------------------------------ 1.60s 2025-09-27 00:47:08.241454 | orchestrator | k3s_custom_registries : Insert registries into /etc/rancher/k3s/registries.yaml --- 1.58s 2025-09-27 00:47:08.241463 | orchestrator | k3s_server : Stop k3s-init ---------------------------------------------- 1.48s 2025-09-27 00:47:08.241473 | orchestrator | k3s_server : Create custom resolv.conf for k3s -------------------------- 1.46s 2025-09-27 00:47:11.259012 | orchestrator | 2025-09-27 00:47:11 | INFO  | Task 625d002c-1b01-4e2f-8d5c-d5ab32c60c73 is in state STARTED 2025-09-27 00:47:11.259275 | orchestrator | 2025-09-27 00:47:11 | INFO  | Task 4bac4415-47ae-418f-938a-24c52f1a62d3 is in state STARTED 2025-09-27 00:47:11.259797 | orchestrator | 2025-09-27 00:47:11 | INFO  | Task 2829b99c-2790-4119-b8ac-e7a6a3070799 is in state STARTED 2025-09-27 00:47:11.260403 | orchestrator | 2025-09-27 00:47:11 | INFO  | Task 1d91c610-49e9-42f0-89a7-1a5acbff1807 is in state STARTED 2025-09-27 00:47:11.260424 | orchestrator | 2025-09-27 00:47:11 | INFO  | Wait 1 second(s) until the next check 2025-09-27 00:47:14.320301 | orchestrator | 2025-09-27 00:47:14 | INFO  | Task 625d002c-1b01-4e2f-8d5c-d5ab32c60c73 is in state STARTED 2025-09-27 00:47:14.320789 | orchestrator | 2025-09-27 00:47:14 | INFO  | Task 
4bac4415-47ae-418f-938a-24c52f1a62d3 is in state STARTED 2025-09-27 00:47:14.322541 | orchestrator | 2025-09-27 00:47:14 | INFO  | Task 2829b99c-2790-4119-b8ac-e7a6a3070799 is in state STARTED 2025-09-27 00:47:14.324267 | orchestrator | 2025-09-27 00:47:14 | INFO  | Task 1d91c610-49e9-42f0-89a7-1a5acbff1807 is in state STARTED 2025-09-27 00:47:14.324300 | orchestrator | 2025-09-27 00:47:14 | INFO  | Wait 1 second(s) until the next check 2025-09-27 00:47:17.371154 | orchestrator | 2025-09-27 00:47:17 | INFO  | Task 625d002c-1b01-4e2f-8d5c-d5ab32c60c73 is in state STARTED 2025-09-27 00:47:17.371431 | orchestrator | 2025-09-27 00:47:17 | INFO  | Task 4bac4415-47ae-418f-938a-24c52f1a62d3 is in state STARTED 2025-09-27 00:47:17.372021 | orchestrator | 2025-09-27 00:47:17 | INFO  | Task 2829b99c-2790-4119-b8ac-e7a6a3070799 is in state SUCCESS 2025-09-27 00:47:17.372839 | orchestrator | 2025-09-27 00:47:17 | INFO  | Task 1d91c610-49e9-42f0-89a7-1a5acbff1807 is in state STARTED 2025-09-27 00:47:17.372861 | orchestrator | 2025-09-27 00:47:17 | INFO  | Wait 1 second(s) until the next check 2025-09-27 00:47:20.406312 | orchestrator | 2025-09-27 00:47:20 | INFO  | Task 625d002c-1b01-4e2f-8d5c-d5ab32c60c73 is in state STARTED 2025-09-27 00:47:20.408390 | orchestrator | 2025-09-27 00:47:20 | INFO  | Task 4bac4415-47ae-418f-938a-24c52f1a62d3 is in state STARTED 2025-09-27 00:47:20.410610 | orchestrator | 2025-09-27 00:47:20 | INFO  | Task 1d91c610-49e9-42f0-89a7-1a5acbff1807 is in state SUCCESS 2025-09-27 00:47:20.410627 | orchestrator | 2025-09-27 00:47:20 | INFO  | Wait 1 second(s) until the next check 2025-09-27 00:47:23.454869 | orchestrator | 2025-09-27 00:47:23 | INFO  | Task 625d002c-1b01-4e2f-8d5c-d5ab32c60c73 is in state STARTED 2025-09-27 00:47:23.454968 | orchestrator | 2025-09-27 00:47:23 | INFO  | Task 4bac4415-47ae-418f-938a-24c52f1a62d3 is in state STARTED 2025-09-27 00:47:23.455012 | orchestrator | 2025-09-27 00:47:23 | INFO  | Wait 1 second(s) until the next check 2025-09-27 00:47:26.490994 | orchestrator | 2025-09-27 00:47:26 | INFO  | Task 625d002c-1b01-4e2f-8d5c-d5ab32c60c73 is in state STARTED 2025-09-27 00:47:26.492664 | orchestrator | 2025-09-27 00:47:26 | INFO  | Task 4bac4415-47ae-418f-938a-24c52f1a62d3 is in state STARTED 2025-09-27 00:47:26.492860 | orchestrator | 2025-09-27 00:47:26 | INFO  | Wait 1 second(s) until the next check 2025-09-27 00:47:29.536735 | orchestrator | 2025-09-27 00:47:29 | INFO  | Task 625d002c-1b01-4e2f-8d5c-d5ab32c60c73 is in state STARTED 2025-09-27 00:47:29.539102 | orchestrator | 2025-09-27 00:47:29 | INFO  | Task 4bac4415-47ae-418f-938a-24c52f1a62d3 is in state STARTED 2025-09-27 00:47:29.539399 | orchestrator | 2025-09-27 00:47:29 | INFO  | Wait 1 second(s) until the next check 2025-09-27 00:47:32.568557 | orchestrator | 2025-09-27 00:47:32 | INFO  | Task 625d002c-1b01-4e2f-8d5c-d5ab32c60c73 is in state STARTED 2025-09-27 00:47:32.573042 | orchestrator | 2025-09-27 00:47:32 | INFO  | Task 4bac4415-47ae-418f-938a-24c52f1a62d3 is in state STARTED 2025-09-27 00:47:32.573068 | orchestrator | 2025-09-27 00:47:32 | INFO  | Wait 1 second(s) until the next check 2025-09-27 00:47:35.617643 | orchestrator | 2025-09-27 00:47:35 | INFO  | Task 625d002c-1b01-4e2f-8d5c-d5ab32c60c73 is in state STARTED 2025-09-27 00:47:35.620010 | orchestrator | 2025-09-27 00:47:35 | INFO  | Task 4bac4415-47ae-418f-938a-24c52f1a62d3 is in state STARTED 2025-09-27 00:47:35.620056 | orchestrator | 2025-09-27 00:47:35 | INFO  | Wait 1 second(s) until the next check 2025-09-27 
00:47:38.658629 | orchestrator | 2025-09-27 00:47:38 | INFO  | Task 625d002c-1b01-4e2f-8d5c-d5ab32c60c73 is in state STARTED 2025-09-27 00:47:38.660577 | orchestrator | 2025-09-27 00:47:38 | INFO  | Task 4bac4415-47ae-418f-938a-24c52f1a62d3 is in state STARTED 2025-09-27 00:47:38.660614 | orchestrator | 2025-09-27 00:47:38 | INFO  | Wait 1 second(s) until the next check 2025-09-27 00:47:41.697755 | orchestrator | 2025-09-27 00:47:41 | INFO  | Task 625d002c-1b01-4e2f-8d5c-d5ab32c60c73 is in state STARTED 2025-09-27 00:47:41.698674 | orchestrator | 2025-09-27 00:47:41 | INFO  | Task 4bac4415-47ae-418f-938a-24c52f1a62d3 is in state STARTED 2025-09-27 00:47:41.698706 | orchestrator | 2025-09-27 00:47:41 | INFO  | Wait 1 second(s) until the next check 2025-09-27 00:47:44.731273 | orchestrator | 2025-09-27 00:47:44 | INFO  | Task 625d002c-1b01-4e2f-8d5c-d5ab32c60c73 is in state STARTED 2025-09-27 00:47:44.733154 | orchestrator | 2025-09-27 00:47:44 | INFO  | Task 4bac4415-47ae-418f-938a-24c52f1a62d3 is in state STARTED 2025-09-27 00:47:44.733275 | orchestrator | 2025-09-27 00:47:44 | INFO  | Wait 1 second(s) until the next check 2025-09-27 00:47:47.777692 | orchestrator | 2025-09-27 00:47:47 | INFO  | Task 625d002c-1b01-4e2f-8d5c-d5ab32c60c73 is in state STARTED 2025-09-27 00:47:47.780359 | orchestrator | 2025-09-27 00:47:47 | INFO  | Task 4bac4415-47ae-418f-938a-24c52f1a62d3 is in state STARTED 2025-09-27 00:47:47.780436 | orchestrator | 2025-09-27 00:47:47 | INFO  | Wait 1 second(s) until the next check 2025-09-27 00:47:50.819469 | orchestrator | 2025-09-27 00:47:50 | INFO  | Task 625d002c-1b01-4e2f-8d5c-d5ab32c60c73 is in state STARTED 2025-09-27 00:47:50.820948 | orchestrator | 2025-09-27 00:47:50 | INFO  | Task 4bac4415-47ae-418f-938a-24c52f1a62d3 is in state STARTED 2025-09-27 00:47:50.821020 | orchestrator | 2025-09-27 00:47:50 | INFO  | Wait 1 second(s) until the next check 2025-09-27 00:47:53.866807 | orchestrator | 2025-09-27 00:47:53 | INFO  | Task 625d002c-1b01-4e2f-8d5c-d5ab32c60c73 is in state STARTED 2025-09-27 00:47:53.868417 | orchestrator | 2025-09-27 00:47:53 | INFO  | Task 4bac4415-47ae-418f-938a-24c52f1a62d3 is in state STARTED 2025-09-27 00:47:53.868492 | orchestrator | 2025-09-27 00:47:53 | INFO  | Wait 1 second(s) until the next check 2025-09-27 00:47:56.905056 | orchestrator | 2025-09-27 00:47:56 | INFO  | Task 625d002c-1b01-4e2f-8d5c-d5ab32c60c73 is in state STARTED 2025-09-27 00:47:56.906288 | orchestrator | 2025-09-27 00:47:56 | INFO  | Task 4bac4415-47ae-418f-938a-24c52f1a62d3 is in state STARTED 2025-09-27 00:47:56.906685 | orchestrator | 2025-09-27 00:47:56 | INFO  | Wait 1 second(s) until the next check 2025-09-27 00:47:59.946408 | orchestrator | 2025-09-27 00:47:59 | INFO  | Task 625d002c-1b01-4e2f-8d5c-d5ab32c60c73 is in state STARTED 2025-09-27 00:47:59.946504 | orchestrator | 2025-09-27 00:47:59 | INFO  | Task 4bac4415-47ae-418f-938a-24c52f1a62d3 is in state STARTED 2025-09-27 00:47:59.946518 | orchestrator | 2025-09-27 00:47:59 | INFO  | Wait 1 second(s) until the next check 2025-09-27 00:48:02.977241 | orchestrator | 2025-09-27 00:48:02 | INFO  | Task 625d002c-1b01-4e2f-8d5c-d5ab32c60c73 is in state STARTED 2025-09-27 00:48:02.977877 | orchestrator | 2025-09-27 00:48:02 | INFO  | Task 4bac4415-47ae-418f-938a-24c52f1a62d3 is in state STARTED 2025-09-27 00:48:02.977921 | orchestrator | 2025-09-27 00:48:02 | INFO  | Wait 1 second(s) until the next check 2025-09-27 00:48:06.027282 | orchestrator | 2025-09-27 00:48:06 | INFO  | Task 
625d002c-1b01-4e2f-8d5c-d5ab32c60c73 is in state STARTED 2025-09-27 00:48:06.027897 | orchestrator | 2025-09-27 00:48:06 | INFO  | Task 4bac4415-47ae-418f-938a-24c52f1a62d3 is in state STARTED 2025-09-27 00:48:06.027931 | orchestrator | 2025-09-27 00:48:06 | INFO  | Wait 1 second(s) until the next check 2025-09-27 00:48:09.057319 | orchestrator | 2025-09-27 00:48:09 | INFO  | Task 625d002c-1b01-4e2f-8d5c-d5ab32c60c73 is in state STARTED 2025-09-27 00:48:09.057995 | orchestrator | 2025-09-27 00:48:09 | INFO  | Task 4bac4415-47ae-418f-938a-24c52f1a62d3 is in state STARTED 2025-09-27 00:48:09.058899 | orchestrator | 2025-09-27 00:48:09 | INFO  | Wait 1 second(s) until the next check 2025-09-27 00:48:12.090628 | orchestrator | 2025-09-27 00:48:12 | INFO  | Task 625d002c-1b01-4e2f-8d5c-d5ab32c60c73 is in state STARTED 2025-09-27 00:48:12.091299 | orchestrator | 2025-09-27 00:48:12 | INFO  | Task 4bac4415-47ae-418f-938a-24c52f1a62d3 is in state STARTED 2025-09-27 00:48:12.091333 | orchestrator | 2025-09-27 00:48:12 | INFO  | Wait 1 second(s) until the next check 2025-09-27 00:48:15.135930 | orchestrator | 2025-09-27 00:48:15 | INFO  | Task 625d002c-1b01-4e2f-8d5c-d5ab32c60c73 is in state STARTED 2025-09-27 00:48:15.137161 | orchestrator | 2025-09-27 00:48:15 | INFO  | Task 4bac4415-47ae-418f-938a-24c52f1a62d3 is in state STARTED 2025-09-27 00:48:15.137721 | orchestrator | 2025-09-27 00:48:15 | INFO  | Wait 1 second(s) until the next check 2025-09-27 00:48:18.190851 | orchestrator | 2025-09-27 00:48:18 | INFO  | Task 625d002c-1b01-4e2f-8d5c-d5ab32c60c73 is in state STARTED 2025-09-27 00:48:18.191959 | orchestrator | 2025-09-27 00:48:18 | INFO  | Task 4bac4415-47ae-418f-938a-24c52f1a62d3 is in state STARTED 2025-09-27 00:48:18.192271 | orchestrator | 2025-09-27 00:48:18 | INFO  | Wait 1 second(s) until the next check 2025-09-27 00:48:21.237156 | orchestrator | 2025-09-27 00:48:21 | INFO  | Task 625d002c-1b01-4e2f-8d5c-d5ab32c60c73 is in state STARTED 2025-09-27 00:48:21.238623 | orchestrator | 2025-09-27 00:48:21 | INFO  | Task 4bac4415-47ae-418f-938a-24c52f1a62d3 is in state STARTED 2025-09-27 00:48:21.238884 | orchestrator | 2025-09-27 00:48:21 | INFO  | Wait 1 second(s) until the next check 2025-09-27 00:48:24.287407 | orchestrator | 2025-09-27 00:48:24 | INFO  | Task 625d002c-1b01-4e2f-8d5c-d5ab32c60c73 is in state STARTED 2025-09-27 00:48:24.287720 | orchestrator | 2025-09-27 00:48:24 | INFO  | Task 4bac4415-47ae-418f-938a-24c52f1a62d3 is in state STARTED 2025-09-27 00:48:24.287743 | orchestrator | 2025-09-27 00:48:24 | INFO  | Wait 1 second(s) until the next check 2025-09-27 00:48:27.316687 | orchestrator | 2025-09-27 00:48:27 | INFO  | Task 625d002c-1b01-4e2f-8d5c-d5ab32c60c73 is in state STARTED 2025-09-27 00:48:27.316911 | orchestrator | 2025-09-27 00:48:27 | INFO  | Task 4bac4415-47ae-418f-938a-24c52f1a62d3 is in state STARTED 2025-09-27 00:48:27.316936 | orchestrator | 2025-09-27 00:48:27 | INFO  | Wait 1 second(s) until the next check 2025-09-27 00:48:30.352306 | orchestrator | 2025-09-27 00:48:30 | INFO  | Task 625d002c-1b01-4e2f-8d5c-d5ab32c60c73 is in state STARTED 2025-09-27 00:48:30.354184 | orchestrator | 2025-09-27 00:48:30 | INFO  | Task 4bac4415-47ae-418f-938a-24c52f1a62d3 is in state STARTED 2025-09-27 00:48:30.354237 | orchestrator | 2025-09-27 00:48:30 | INFO  | Wait 1 second(s) until the next check 2025-09-27 00:48:33.391729 | orchestrator | 2025-09-27 00:48:33 | INFO  | Task 625d002c-1b01-4e2f-8d5c-d5ab32c60c73 is in state STARTED 2025-09-27 00:48:33.392622 | orchestrator 
| 2025-09-27 00:48:33 | INFO  | Task 4bac4415-47ae-418f-938a-24c52f1a62d3 is in state STARTED 2025-09-27 00:48:33.392653 | orchestrator | 2025-09-27 00:48:33 | INFO  | Wait 1 second(s) until the next check 2025-09-27 00:48:36.438087 | orchestrator | 2025-09-27 00:48:36 | INFO  | Task 625d002c-1b01-4e2f-8d5c-d5ab32c60c73 is in state STARTED 2025-09-27 00:48:36.440340 | orchestrator | 2025-09-27 00:48:36 | INFO  | Task 4bac4415-47ae-418f-938a-24c52f1a62d3 is in state STARTED 2025-09-27 00:48:36.440725 | orchestrator | 2025-09-27 00:48:36 | INFO  | Wait 1 second(s) until the next check 2025-09-27 00:48:39.486502 | orchestrator | 2025-09-27 00:48:39 | INFO  | Task 625d002c-1b01-4e2f-8d5c-d5ab32c60c73 is in state STARTED 2025-09-27 00:48:39.488485 | orchestrator | 2025-09-27 00:48:39 | INFO  | Task 4bac4415-47ae-418f-938a-24c52f1a62d3 is in state STARTED 2025-09-27 00:48:39.488513 | orchestrator | 2025-09-27 00:48:39 | INFO  | Wait 1 second(s) until the next check 2025-09-27 00:48:42.540581 | orchestrator | 2025-09-27 00:48:42 | INFO  | Task 625d002c-1b01-4e2f-8d5c-d5ab32c60c73 is in state STARTED 2025-09-27 00:48:42.543171 | orchestrator | 2025-09-27 00:48:42 | INFO  | Task 4bac4415-47ae-418f-938a-24c52f1a62d3 is in state STARTED 2025-09-27 00:48:42.543202 | orchestrator | 2025-09-27 00:48:42 | INFO  | Wait 1 second(s) until the next check 2025-09-27 00:48:45.587744 | orchestrator | 2025-09-27 00:48:45 | INFO  | Task 625d002c-1b01-4e2f-8d5c-d5ab32c60c73 is in state STARTED 2025-09-27 00:48:45.590591 | orchestrator | 2025-09-27 00:48:45 | INFO  | Task 4bac4415-47ae-418f-938a-24c52f1a62d3 is in state STARTED 2025-09-27 00:48:45.590792 | orchestrator | 2025-09-27 00:48:45 | INFO  | Wait 1 second(s) until the next check 2025-09-27 00:48:48.640883 | orchestrator | 2025-09-27 00:48:48 | INFO  | Task 625d002c-1b01-4e2f-8d5c-d5ab32c60c73 is in state STARTED 2025-09-27 00:48:48.641340 | orchestrator | 2025-09-27 00:48:48 | INFO  | Task 4bac4415-47ae-418f-938a-24c52f1a62d3 is in state STARTED 2025-09-27 00:48:48.641371 | orchestrator | 2025-09-27 00:48:48 | INFO  | Wait 1 second(s) until the next check 2025-09-27 00:48:51.691789 | orchestrator | 2025-09-27 00:48:51 | INFO  | Task 625d002c-1b01-4e2f-8d5c-d5ab32c60c73 is in state STARTED 2025-09-27 00:48:51.694594 | orchestrator | 2025-09-27 00:48:51 | INFO  | Task 4bac4415-47ae-418f-938a-24c52f1a62d3 is in state STARTED 2025-09-27 00:48:51.694629 | orchestrator | 2025-09-27 00:48:51 | INFO  | Wait 1 second(s) until the next check 2025-09-27 00:48:54.744661 | orchestrator | 2025-09-27 00:48:54 | INFO  | Task 625d002c-1b01-4e2f-8d5c-d5ab32c60c73 is in state STARTED 2025-09-27 00:48:54.750225 | orchestrator | 2025-09-27 00:48:54 | INFO  | Task 4bac4415-47ae-418f-938a-24c52f1a62d3 is in state STARTED 2025-09-27 00:48:54.750380 | orchestrator | 2025-09-27 00:48:54 | INFO  | Wait 1 second(s) until the next check 2025-09-27 00:48:57.791763 | orchestrator | 2025-09-27 00:48:57 | INFO  | Task 625d002c-1b01-4e2f-8d5c-d5ab32c60c73 is in state STARTED 2025-09-27 00:48:57.791870 | orchestrator | 2025-09-27 00:48:57 | INFO  | Task 4bac4415-47ae-418f-938a-24c52f1a62d3 is in state STARTED 2025-09-27 00:48:57.791885 | orchestrator | 2025-09-27 00:48:57 | INFO  | Wait 1 second(s) until the next check 2025-09-27 00:49:00.841303 | orchestrator | 2025-09-27 00:49:00 | INFO  | Task 625d002c-1b01-4e2f-8d5c-d5ab32c60c73 is in state STARTED 2025-09-27 00:49:00.841960 | orchestrator | 2025-09-27 00:49:00 | INFO  | Task 4bac4415-47ae-418f-938a-24c52f1a62d3 is in state STARTED 
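
The long run of status lines above is the manager's task watcher: it polls the state of the submitted runs (identified by their task UUIDs) and sleeps one second between checks until each task leaves the STARTED state. A minimal sketch of that poll-and-wait pattern; get_task_state() is a hypothetical stand-in for the real status lookup, not part of this job:

# Sketch of the poll-until-done pattern visible in the log; get_task_state() is hypothetical.
import time

def get_task_state(task_id: str) -> str:
    raise NotImplementedError("stand-in for the manager's task status API")

def wait_for_tasks(task_ids, interval: float = 1.0) -> None:
    pending = set(task_ids)
    while pending:
        for task_id in sorted(pending):
            state = get_task_state(task_id)
            print(f"Task {task_id} is in state {state}")
            if state == "SUCCESS":
                pending.discard(task_id)   # stop watching finished tasks
        if pending:
            print(f"Wait {interval:.0f} second(s) until the next check")
            time.sleep(interval)
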
2025-09-27 00:49:00.842132 | orchestrator | 2025-09-27 00:49:00 | INFO  | Wait 1 second(s) until the next check 2025-09-27 00:49:03.885973 | orchestrator | 2025-09-27 00:49:03 | INFO  | Task 625d002c-1b01-4e2f-8d5c-d5ab32c60c73 is in state STARTED 2025-09-27 00:49:03.888352 | orchestrator | 2025-09-27 00:49:03 | INFO  | Task 4bac4415-47ae-418f-938a-24c52f1a62d3 is in state STARTED 2025-09-27 00:49:03.888396 | orchestrator | 2025-09-27 00:49:03 | INFO  | Wait 1 second(s) until the next check 2025-09-27 00:49:06.925962 | orchestrator | 2025-09-27 00:49:06 | INFO  | Task 625d002c-1b01-4e2f-8d5c-d5ab32c60c73 is in state STARTED 2025-09-27 00:49:06.926503 | orchestrator | 2025-09-27 00:49:06 | INFO  | Task 4bac4415-47ae-418f-938a-24c52f1a62d3 is in state STARTED 2025-09-27 00:49:06.926532 | orchestrator | 2025-09-27 00:49:06 | INFO  | Wait 1 second(s) until the next check 2025-09-27 00:49:09.971499 | orchestrator | 2025-09-27 00:49:09 | INFO  | Task 625d002c-1b01-4e2f-8d5c-d5ab32c60c73 is in state STARTED 2025-09-27 00:49:09.972168 | orchestrator | 2025-09-27 00:49:09 | INFO  | Task 4bac4415-47ae-418f-938a-24c52f1a62d3 is in state STARTED 2025-09-27 00:49:09.972228 | orchestrator | 2025-09-27 00:49:09 | INFO  | Wait 1 second(s) until the next check 2025-09-27 00:49:13.020492 | orchestrator | 2025-09-27 00:49:13 | INFO  | Task 625d002c-1b01-4e2f-8d5c-d5ab32c60c73 is in state STARTED 2025-09-27 00:49:13.022550 | orchestrator | 2025-09-27 00:49:13 | INFO  | Task 4bac4415-47ae-418f-938a-24c52f1a62d3 is in state STARTED 2025-09-27 00:49:13.022770 | orchestrator | 2025-09-27 00:49:13 | INFO  | Wait 1 second(s) until the next check 2025-09-27 00:49:16.075534 | orchestrator | 2025-09-27 00:49:16 | INFO  | Task 625d002c-1b01-4e2f-8d5c-d5ab32c60c73 is in state STARTED 2025-09-27 00:49:16.075642 | orchestrator | 2025-09-27 00:49:16 | INFO  | Task 4bac4415-47ae-418f-938a-24c52f1a62d3 is in state STARTED 2025-09-27 00:49:16.075659 | orchestrator | 2025-09-27 00:49:16 | INFO  | Wait 1 second(s) until the next check 2025-09-27 00:49:19.127969 | orchestrator | 2025-09-27 00:49:19 | INFO  | Task 625d002c-1b01-4e2f-8d5c-d5ab32c60c73 is in state STARTED 2025-09-27 00:49:19.129493 | orchestrator | 2025-09-27 00:49:19 | INFO  | Task 4bac4415-47ae-418f-938a-24c52f1a62d3 is in state STARTED 2025-09-27 00:49:19.129526 | orchestrator | 2025-09-27 00:49:19 | INFO  | Wait 1 second(s) until the next check 2025-09-27 00:49:22.169616 | orchestrator | 2025-09-27 00:49:22 | INFO  | Task 625d002c-1b01-4e2f-8d5c-d5ab32c60c73 is in state STARTED 2025-09-27 00:49:22.170645 | orchestrator | 2025-09-27 00:49:22 | INFO  | Task 4bac4415-47ae-418f-938a-24c52f1a62d3 is in state STARTED 2025-09-27 00:49:22.170701 | orchestrator | 2025-09-27 00:49:22 | INFO  | Wait 1 second(s) until the next check 2025-09-27 00:49:25.208338 | orchestrator | 2025-09-27 00:49:25 | INFO  | Task 625d002c-1b01-4e2f-8d5c-d5ab32c60c73 is in state STARTED 2025-09-27 00:49:25.209083 | orchestrator | 2025-09-27 00:49:25 | INFO  | Task 4bac4415-47ae-418f-938a-24c52f1a62d3 is in state STARTED 2025-09-27 00:49:25.209116 | orchestrator | 2025-09-27 00:49:25 | INFO  | Wait 1 second(s) until the next check 2025-09-27 00:49:28.260881 | orchestrator | 2025-09-27 00:49:28 | INFO  | Task 625d002c-1b01-4e2f-8d5c-d5ab32c60c73 is in state STARTED 2025-09-27 00:49:28.263509 | orchestrator | 2025-09-27 00:49:28 | INFO  | Task 4bac4415-47ae-418f-938a-24c52f1a62d3 is in state STARTED 2025-09-27 00:49:28.263838 | orchestrator | 2025-09-27 00:49:28 | INFO  | Wait 1 second(s) until 
the next check 2025-09-27 00:49:31.313378 | orchestrator | 2025-09-27 00:49:31 | INFO  | Task 625d002c-1b01-4e2f-8d5c-d5ab32c60c73 is in state STARTED 2025-09-27 00:49:31.314591 | orchestrator | 2025-09-27 00:49:31 | INFO  | Task 4bac4415-47ae-418f-938a-24c52f1a62d3 is in state STARTED 2025-09-27 00:49:31.314623 | orchestrator | 2025-09-27 00:49:31 | INFO  | Wait 1 second(s) until the next check 2025-09-27 00:49:34.364651 | orchestrator | 2025-09-27 00:49:34 | INFO  | Task 625d002c-1b01-4e2f-8d5c-d5ab32c60c73 is in state STARTED 2025-09-27 00:49:34.364754 | orchestrator | 2025-09-27 00:49:34 | INFO  | Task 4fa01eb7-5be3-437c-b4e5-8b90d9b0cd67 is in state STARTED 2025-09-27 00:49:34.366140 | orchestrator | 2025-09-27 00:49:34 | INFO  | Task 4edf2878-7e82-4d4c-a11c-81223c9994e2 is in state STARTED 2025-09-27 00:49:34.376558 | orchestrator | 2025-09-27 00:49:34 | INFO  | Task 4bac4415-47ae-418f-938a-24c52f1a62d3 is in state SUCCESS 2025-09-27 00:49:34.377381 | orchestrator | 2025-09-27 00:49:34.377406 | orchestrator | 2025-09-27 00:49:34.377419 | orchestrator | PLAY [Copy kubeconfig to the configuration repository] ************************* 2025-09-27 00:49:34.377433 | orchestrator | 2025-09-27 00:49:34.377445 | orchestrator | TASK [Get kubeconfig file] ***************************************************** 2025-09-27 00:49:34.377458 | orchestrator | Saturday 27 September 2025 00:47:11 +0000 (0:00:00.142) 0:00:00.143 **** 2025-09-27 00:49:34.377471 | orchestrator | ok: [testbed-manager -> testbed-node-0(192.168.16.10)] 2025-09-27 00:49:34.377483 | orchestrator | 2025-09-27 00:49:34.377495 | orchestrator | TASK [Write kubeconfig file] *************************************************** 2025-09-27 00:49:34.377507 | orchestrator | Saturday 27 September 2025 00:47:12 +0000 (0:00:00.766) 0:00:00.909 **** 2025-09-27 00:49:34.377520 | orchestrator | changed: [testbed-manager] 2025-09-27 00:49:34.377533 | orchestrator | 2025-09-27 00:49:34.377544 | orchestrator | TASK [Change server address in the kubeconfig file] **************************** 2025-09-27 00:49:34.377555 | orchestrator | Saturday 27 September 2025 00:47:13 +0000 (0:00:01.068) 0:00:01.978 **** 2025-09-27 00:49:34.377566 | orchestrator | changed: [testbed-manager] 2025-09-27 00:49:34.377577 | orchestrator | 2025-09-27 00:49:34.377587 | orchestrator | PLAY RECAP ********************************************************************* 2025-09-27 00:49:34.377598 | orchestrator | testbed-manager : ok=3  changed=2  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 2025-09-27 00:49:34.377610 | orchestrator | 2025-09-27 00:49:34.377621 | orchestrator | 2025-09-27 00:49:34.377650 | orchestrator | TASKS RECAP ******************************************************************** 2025-09-27 00:49:34.377661 | orchestrator | Saturday 27 September 2025 00:47:14 +0000 (0:00:00.412) 0:00:02.390 **** 2025-09-27 00:49:34.377771 | orchestrator | =============================================================================== 2025-09-27 00:49:34.377787 | orchestrator | Write kubeconfig file --------------------------------------------------- 1.07s 2025-09-27 00:49:34.377798 | orchestrator | Get kubeconfig file ----------------------------------------------------- 0.77s 2025-09-27 00:49:34.377834 | orchestrator | Change server address in the kubeconfig file ---------------------------- 0.41s 2025-09-27 00:49:34.377845 | orchestrator | 2025-09-27 00:49:34.377856 | orchestrator | 2025-09-27 00:49:34.377867 | orchestrator | PLAY [Prepare kubeconfig file] 
************************************************* 2025-09-27 00:49:34.377877 | orchestrator | 2025-09-27 00:49:34.377888 | orchestrator | TASK [Get home directory of operator user] ************************************* 2025-09-27 00:49:34.377898 | orchestrator | Saturday 27 September 2025 00:47:11 +0000 (0:00:00.167) 0:00:00.167 **** 2025-09-27 00:49:34.377909 | orchestrator | ok: [testbed-manager] 2025-09-27 00:49:34.377921 | orchestrator | 2025-09-27 00:49:34.377931 | orchestrator | TASK [Create .kube directory] ************************************************** 2025-09-27 00:49:34.377942 | orchestrator | Saturday 27 September 2025 00:47:12 +0000 (0:00:00.581) 0:00:00.749 **** 2025-09-27 00:49:34.378091 | orchestrator | ok: [testbed-manager] 2025-09-27 00:49:34.378104 | orchestrator | 2025-09-27 00:49:34.378115 | orchestrator | TASK [Get kubeconfig file] ***************************************************** 2025-09-27 00:49:34.378125 | orchestrator | Saturday 27 September 2025 00:47:12 +0000 (0:00:00.496) 0:00:01.245 **** 2025-09-27 00:49:34.378136 | orchestrator | ok: [testbed-manager -> testbed-node-0(192.168.16.10)] 2025-09-27 00:49:34.378147 | orchestrator | 2025-09-27 00:49:34.378158 | orchestrator | TASK [Write kubeconfig file] *************************************************** 2025-09-27 00:49:34.378221 | orchestrator | Saturday 27 September 2025 00:47:13 +0000 (0:00:00.684) 0:00:01.929 **** 2025-09-27 00:49:34.378234 | orchestrator | changed: [testbed-manager] 2025-09-27 00:49:34.378244 | orchestrator | 2025-09-27 00:49:34.378255 | orchestrator | TASK [Change server address in the kubeconfig] ********************************* 2025-09-27 00:49:34.378266 | orchestrator | Saturday 27 September 2025 00:47:14 +0000 (0:00:01.011) 0:00:02.941 **** 2025-09-27 00:49:34.378338 | orchestrator | changed: [testbed-manager] 2025-09-27 00:49:34.378350 | orchestrator | 2025-09-27 00:49:34.378361 | orchestrator | TASK [Make kubeconfig available for use inside the manager service] ************ 2025-09-27 00:49:34.378371 | orchestrator | Saturday 27 September 2025 00:47:15 +0000 (0:00:00.774) 0:00:03.715 **** 2025-09-27 00:49:34.378382 | orchestrator | changed: [testbed-manager -> localhost] 2025-09-27 00:49:34.378393 | orchestrator | 2025-09-27 00:49:34.378404 | orchestrator | TASK [Change server address in the kubeconfig inside the manager service] ****** 2025-09-27 00:49:34.378415 | orchestrator | Saturday 27 September 2025 00:47:16 +0000 (0:00:01.578) 0:00:05.293 **** 2025-09-27 00:49:34.378427 | orchestrator | changed: [testbed-manager -> localhost] 2025-09-27 00:49:34.378519 | orchestrator | 2025-09-27 00:49:34.378531 | orchestrator | TASK [Set KUBECONFIG environment variable] ************************************* 2025-09-27 00:49:34.378543 | orchestrator | Saturday 27 September 2025 00:47:17 +0000 (0:00:00.825) 0:00:06.118 **** 2025-09-27 00:49:34.382264 | orchestrator | ok: [testbed-manager] 2025-09-27 00:49:34.382352 | orchestrator | 2025-09-27 00:49:34.382369 | orchestrator | TASK [Enable kubectl command line completion] ********************************** 2025-09-27 00:49:34.382382 | orchestrator | Saturday 27 September 2025 00:47:18 +0000 (0:00:00.422) 0:00:06.541 **** 2025-09-27 00:49:34.382393 | orchestrator | ok: [testbed-manager] 2025-09-27 00:49:34.382403 | orchestrator | 2025-09-27 00:49:34.382414 | orchestrator | PLAY RECAP ********************************************************************* 2025-09-27 00:49:34.382425 | orchestrator | testbed-manager : ok=9  changed=4  
unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 2025-09-27 00:49:34.382436 | orchestrator | 2025-09-27 00:49:34.382447 | orchestrator | 2025-09-27 00:49:34.382458 | orchestrator | TASKS RECAP ******************************************************************** 2025-09-27 00:49:34.382469 | orchestrator | Saturday 27 September 2025 00:47:18 +0000 (0:00:00.299) 0:00:06.841 **** 2025-09-27 00:49:34.382480 | orchestrator | =============================================================================== 2025-09-27 00:49:34.382490 | orchestrator | Make kubeconfig available for use inside the manager service ------------ 1.58s 2025-09-27 00:49:34.382502 | orchestrator | Write kubeconfig file --------------------------------------------------- 1.01s 2025-09-27 00:49:34.382542 | orchestrator | Change server address in the kubeconfig inside the manager service ------ 0.83s 2025-09-27 00:49:34.382588 | orchestrator | Change server address in the kubeconfig --------------------------------- 0.77s 2025-09-27 00:49:34.382600 | orchestrator | Get kubeconfig file ----------------------------------------------------- 0.68s 2025-09-27 00:49:34.382611 | orchestrator | Get home directory of operator user ------------------------------------- 0.58s 2025-09-27 00:49:34.382622 | orchestrator | Create .kube directory -------------------------------------------------- 0.50s 2025-09-27 00:49:34.382633 | orchestrator | Set KUBECONFIG environment variable ------------------------------------- 0.42s 2025-09-27 00:49:34.382644 | orchestrator | Enable kubectl command line completion ---------------------------------- 0.30s 2025-09-27 00:49:34.382654 | orchestrator | 2025-09-27 00:49:34.382665 | orchestrator | 2025-09-27 00:49:34.382676 | orchestrator | PLAY [Group hosts based on configuration] ************************************** 2025-09-27 00:49:34.382686 | orchestrator | 2025-09-27 00:49:34.382697 | orchestrator | TASK [Group hosts based on Kolla action] *************************************** 2025-09-27 00:49:34.382708 | orchestrator | Saturday 27 September 2025 00:44:07 +0000 (0:00:01.041) 0:00:01.041 **** 2025-09-27 00:49:34.382719 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:49:34.382730 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:49:34.382740 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:49:34.382751 | orchestrator | 2025-09-27 00:49:34.382762 | orchestrator | TASK [Group hosts based on enabled services] *********************************** 2025-09-27 00:49:34.382773 | orchestrator | Saturday 27 September 2025 00:44:08 +0000 (0:00:00.824) 0:00:01.866 **** 2025-09-27 00:49:34.382797 | orchestrator | ok: [testbed-node-0] => (item=enable_loadbalancer_True) 2025-09-27 00:49:34.382809 | orchestrator | ok: [testbed-node-1] => (item=enable_loadbalancer_True) 2025-09-27 00:49:34.382820 | orchestrator | ok: [testbed-node-2] => (item=enable_loadbalancer_True) 2025-09-27 00:49:34.382830 | orchestrator | 2025-09-27 00:49:34.382841 | orchestrator | PLAY [Apply role loadbalancer] ************************************************* 2025-09-27 00:49:34.382852 | orchestrator | 2025-09-27 00:49:34.382863 | orchestrator | TASK [loadbalancer : include_tasks] ******************************************** 2025-09-27 00:49:34.382874 | orchestrator | Saturday 27 September 2025 00:44:11 +0000 (0:00:02.602) 0:00:04.469 **** 2025-09-27 00:49:34.382885 | orchestrator | included: /ansible/roles/loadbalancer/tasks/deploy.yml for testbed-node-0, testbed-node-1, testbed-node-2 2025-09-27 00:49:34.382896 | orchestrator | 
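
The "Group hosts based on enabled services" task shown above places each host into a dynamic group named after the flag and its value (here enable_loadbalancer_True), which the subsequent loadbalancer play then targets. A minimal Python sketch of that grouping idea, with an invented inventory used purely for illustration:

# Sketch of grouping hosts by service flags, mirroring the group_by step above.
# The inventory below is invented for illustration only.
from collections import defaultdict

inventory = {
    "testbed-node-0": {"enable_loadbalancer": True},
    "testbed-node-1": {"enable_loadbalancer": True},
    "testbed-node-2": {"enable_loadbalancer": True},
}

groups = defaultdict(list)
for host, hostvars in inventory.items():
    for flag, value in hostvars.items():
        groups[f"{flag}_{value}"].append(host)   # e.g. "enable_loadbalancer_True"

print(dict(groups))
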
2025-09-27 00:49:34.382907 | orchestrator | TASK [loadbalancer : Check IPv6 support] *************************************** 2025-09-27 00:49:34.382917 | orchestrator | Saturday 27 September 2025 00:44:12 +0000 (0:00:01.626) 0:00:06.095 **** 2025-09-27 00:49:34.382928 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:49:34.382939 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:49:34.382950 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:49:34.382960 | orchestrator | 2025-09-27 00:49:34.382971 | orchestrator | TASK [Setting sysctl values] *************************************************** 2025-09-27 00:49:34.382982 | orchestrator | Saturday 27 September 2025 00:44:14 +0000 (0:00:01.319) 0:00:07.414 **** 2025-09-27 00:49:34.382993 | orchestrator | included: sysctl for testbed-node-2, testbed-node-0, testbed-node-1 2025-09-27 00:49:34.383004 | orchestrator | 2025-09-27 00:49:34.383014 | orchestrator | TASK [sysctl : Check IPv6 support] ********************************************* 2025-09-27 00:49:34.383025 | orchestrator | Saturday 27 September 2025 00:44:15 +0000 (0:00:01.207) 0:00:08.622 **** 2025-09-27 00:49:34.383036 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:49:34.383047 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:49:34.383057 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:49:34.383068 | orchestrator | 2025-09-27 00:49:34.383079 | orchestrator | TASK [sysctl : Setting sysctl values] ****************************************** 2025-09-27 00:49:34.383090 | orchestrator | Saturday 27 September 2025 00:44:17 +0000 (0:00:02.065) 0:00:10.687 **** 2025-09-27 00:49:34.383100 | orchestrator | changed: [testbed-node-0] => (item={'name': 'net.ipv6.ip_nonlocal_bind', 'value': 1}) 2025-09-27 00:49:34.383119 | orchestrator | changed: [testbed-node-1] => (item={'name': 'net.ipv6.ip_nonlocal_bind', 'value': 1}) 2025-09-27 00:49:34.383130 | orchestrator | changed: [testbed-node-2] => (item={'name': 'net.ipv6.ip_nonlocal_bind', 'value': 1}) 2025-09-27 00:49:34.383141 | orchestrator | changed: [testbed-node-1] => (item={'name': 'net.ipv4.ip_nonlocal_bind', 'value': 1}) 2025-09-27 00:49:34.383152 | orchestrator | changed: [testbed-node-0] => (item={'name': 'net.ipv4.ip_nonlocal_bind', 'value': 1}) 2025-09-27 00:49:34.383163 | orchestrator | changed: [testbed-node-2] => (item={'name': 'net.ipv4.ip_nonlocal_bind', 'value': 1}) 2025-09-27 00:49:34.383174 | orchestrator | ok: [testbed-node-0] => (item={'name': 'net.ipv4.tcp_retries2', 'value': 'KOLLA_UNSET'}) 2025-09-27 00:49:34.383186 | orchestrator | ok: [testbed-node-1] => (item={'name': 'net.ipv4.tcp_retries2', 'value': 'KOLLA_UNSET'}) 2025-09-27 00:49:34.383197 | orchestrator | ok: [testbed-node-2] => (item={'name': 'net.ipv4.tcp_retries2', 'value': 'KOLLA_UNSET'}) 2025-09-27 00:49:34.383227 | orchestrator | changed: [testbed-node-0] => (item={'name': 'net.unix.max_dgram_qlen', 'value': 128}) 2025-09-27 00:49:34.383238 | orchestrator | changed: [testbed-node-2] => (item={'name': 'net.unix.max_dgram_qlen', 'value': 128}) 2025-09-27 00:49:34.383249 | orchestrator | changed: [testbed-node-1] => (item={'name': 'net.unix.max_dgram_qlen', 'value': 128}) 2025-09-27 00:49:34.383259 | orchestrator | 2025-09-27 00:49:34.383270 | orchestrator | TASK [module-load : Load modules] ********************************************** 2025-09-27 00:49:34.383281 | orchestrator | Saturday 27 September 2025 00:44:21 +0000 (0:00:03.846) 0:00:14.534 **** 2025-09-27 00:49:34.383292 | orchestrator | changed: [testbed-node-1] => (item=ip_vs) 2025-09-27 
00:49:34.383303 | orchestrator | changed: [testbed-node-0] => (item=ip_vs) 2025-09-27 00:49:34.383314 | orchestrator | changed: [testbed-node-2] => (item=ip_vs) 2025-09-27 00:49:34.383324 | orchestrator | 2025-09-27 00:49:34.383335 | orchestrator | TASK [module-load : Persist modules via modules-load.d] ************************ 2025-09-27 00:49:34.383357 | orchestrator | Saturday 27 September 2025 00:44:22 +0000 (0:00:01.385) 0:00:15.920 **** 2025-09-27 00:49:34.383368 | orchestrator | changed: [testbed-node-1] => (item=ip_vs) 2025-09-27 00:49:34.383379 | orchestrator | changed: [testbed-node-0] => (item=ip_vs) 2025-09-27 00:49:34.383390 | orchestrator | changed: [testbed-node-2] => (item=ip_vs) 2025-09-27 00:49:34.383401 | orchestrator | 2025-09-27 00:49:34.383411 | orchestrator | TASK [module-load : Drop module persistence] *********************************** 2025-09-27 00:49:34.383422 | orchestrator | Saturday 27 September 2025 00:44:24 +0000 (0:00:01.866) 0:00:17.787 **** 2025-09-27 00:49:34.383433 | orchestrator | skipping: [testbed-node-0] => (item=ip_vs)  2025-09-27 00:49:34.383443 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:49:34.383454 | orchestrator | skipping: [testbed-node-1] => (item=ip_vs)  2025-09-27 00:49:34.383465 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:49:34.383475 | orchestrator | skipping: [testbed-node-2] => (item=ip_vs)  2025-09-27 00:49:34.383486 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:49:34.383497 | orchestrator | 2025-09-27 00:49:34.383507 | orchestrator | TASK [loadbalancer : Ensuring config directories exist] ************************ 2025-09-27 00:49:34.383518 | orchestrator | Saturday 27 September 2025 00:44:25 +0000 (0:00:00.689) 0:00:18.476 **** 2025-09-27 00:49:34.383532 | orchestrator | changed: [testbed-node-0] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/haproxy:2024.2', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:61313'], 'timeout': '30'}}}) 2025-09-27 00:49:34.383550 | orchestrator | changed: [testbed-node-1] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/haproxy:2024.2', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:61313'], 'timeout': '30'}}}) 2025-09-27 00:49:34.383639 | orchestrator | changed: [testbed-node-1] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/proxysql:2024.2', 'privileged': False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 
'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}}) 2025-09-27 00:49:34.383661 | orchestrator | changed: [testbed-node-2] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/haproxy:2024.2', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:61313'], 'timeout': '30'}}}) 2025-09-27 00:49:34.383673 | orchestrator | changed: [testbed-node-0] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/proxysql:2024.2', 'privileged': False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}}) 2025-09-27 00:49:34.383696 | orchestrator | changed: [testbed-node-1] => (item={'key': 'keepalived', 'value': {'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/keepalived:2024.2', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}}) 2025-09-27 00:49:34.383709 | orchestrator | changed: [testbed-node-0] => (item={'key': 'keepalived', 'value': {'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/keepalived:2024.2', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}}) 2025-09-27 00:49:34.383725 | orchestrator | changed: [testbed-node-2] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/proxysql:2024.2', 'privileged': False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}}) 2025-09-27 00:49:34.383744 | orchestrator | changed: [testbed-node-2] => (item={'key': 'keepalived', 'value': {'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/keepalived:2024.2', 'privileged': True, 
'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}}) 2025-09-27 00:49:34.383755 | orchestrator | 2025-09-27 00:49:34.383767 | orchestrator | TASK [loadbalancer : Ensuring haproxy service config subdir exists] ************ 2025-09-27 00:49:34.383778 | orchestrator | Saturday 27 September 2025 00:44:28 +0000 (0:00:03.388) 0:00:21.865 **** 2025-09-27 00:49:34.383789 | orchestrator | changed: [testbed-node-0] 2025-09-27 00:49:34.383800 | orchestrator | changed: [testbed-node-1] 2025-09-27 00:49:34.383810 | orchestrator | changed: [testbed-node-2] 2025-09-27 00:49:34.383821 | orchestrator | 2025-09-27 00:49:34.383832 | orchestrator | TASK [loadbalancer : Ensuring proxysql service config subdirectories exist] **** 2025-09-27 00:49:34.383843 | orchestrator | Saturday 27 September 2025 00:44:29 +0000 (0:00:01.249) 0:00:23.114 **** 2025-09-27 00:49:34.383854 | orchestrator | changed: [testbed-node-0] => (item=users) 2025-09-27 00:49:34.383864 | orchestrator | changed: [testbed-node-1] => (item=users) 2025-09-27 00:49:34.383875 | orchestrator | changed: [testbed-node-2] => (item=users) 2025-09-27 00:49:34.383886 | orchestrator | changed: [testbed-node-0] => (item=rules) 2025-09-27 00:49:34.383896 | orchestrator | changed: [testbed-node-1] => (item=rules) 2025-09-27 00:49:34.383907 | orchestrator | changed: [testbed-node-2] => (item=rules) 2025-09-27 00:49:34.383917 | orchestrator | 2025-09-27 00:49:34.383928 | orchestrator | TASK [loadbalancer : Ensuring keepalived checks subdir exists] ***************** 2025-09-27 00:49:34.383939 | orchestrator | Saturday 27 September 2025 00:44:31 +0000 (0:00:02.175) 0:00:25.289 **** 2025-09-27 00:49:34.383949 | orchestrator | changed: [testbed-node-1] 2025-09-27 00:49:34.383960 | orchestrator | changed: [testbed-node-0] 2025-09-27 00:49:34.383970 | orchestrator | changed: [testbed-node-2] 2025-09-27 00:49:34.383981 | orchestrator | 2025-09-27 00:49:34.383992 | orchestrator | TASK [loadbalancer : Remove mariadb.cfg if proxysql enabled] ******************* 2025-09-27 00:49:34.384003 | orchestrator | Saturday 27 September 2025 00:44:33 +0000 (0:00:01.411) 0:00:26.701 **** 2025-09-27 00:49:34.384013 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:49:34.384024 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:49:34.384035 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:49:34.384045 | orchestrator | 2025-09-27 00:49:34.384056 | orchestrator | TASK [loadbalancer : Removing checks for services which are disabled] ********** 2025-09-27 00:49:34.384067 | orchestrator | Saturday 27 September 2025 00:44:34 +0000 (0:00:01.642) 0:00:28.343 **** 2025-09-27 00:49:34.384078 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/haproxy:2024.2', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:61313'], 'timeout': '30'}}})  2025-09-27 00:49:34.384097 | orchestrator | 
skipping: [testbed-node-0] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/proxysql:2024.2', 'privileged': False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}})  2025-09-27 00:49:34.384120 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'keepalived', 'value': {'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/keepalived:2024.2', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}})  2025-09-27 00:49:34.384133 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'haproxy-ssh', 'value': {'container_name': 'haproxy_ssh', 'group': 'loadbalancer', 'enabled': False, 'image': 'registry.osism.tech/kolla/haproxy-ssh:2024.2', 'volumes': ['/etc/kolla/haproxy-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', '__omit_place_holder__f72f76ac0d1b11429434a8aff5d8056b51dd2384', '__omit_place_holder__f72f76ac0d1b11429434a8aff5d8056b51dd2384'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 2985'], 'timeout': '30'}}})  2025-09-27 00:49:34.384144 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:49:34.384156 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/haproxy:2024.2', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:61313'], 'timeout': '30'}}})  2025-09-27 00:49:34.384168 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/proxysql:2024.2', 'privileged': False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}})  2025-09-27 00:49:34.384179 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'keepalived', 'value': {'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/keepalived:2024.2', 
'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}})  2025-09-27 00:49:34.384216 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'haproxy-ssh', 'value': {'container_name': 'haproxy_ssh', 'group': 'loadbalancer', 'enabled': False, 'image': 'registry.osism.tech/kolla/haproxy-ssh:2024.2', 'volumes': ['/etc/kolla/haproxy-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', '__omit_place_holder__f72f76ac0d1b11429434a8aff5d8056b51dd2384', '__omit_place_holder__f72f76ac0d1b11429434a8aff5d8056b51dd2384'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 2985'], 'timeout': '30'}}})  2025-09-27 00:49:34.384234 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:49:34.384246 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/haproxy:2024.2', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:61313'], 'timeout': '30'}}})  2025-09-27 00:49:34.384262 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/proxysql:2024.2', 'privileged': False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}})  2025-09-27 00:49:34.384274 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'keepalived', 'value': {'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/keepalived:2024.2', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}})  2025-09-27 00:49:34.384287 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'haproxy-ssh', 'value': {'container_name': 'haproxy_ssh', 'group': 'loadbalancer', 'enabled': False, 'image': 'registry.osism.tech/kolla/haproxy-ssh:2024.2', 'volumes': ['/etc/kolla/haproxy-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', '__omit_place_holder__f72f76ac0d1b11429434a8aff5d8056b51dd2384', '__omit_place_holder__f72f76ac0d1b11429434a8aff5d8056b51dd2384'], 'dimensions': {}, 
'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 2985'], 'timeout': '30'}}})  2025-09-27 00:49:34.384298 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:49:34.384309 | orchestrator | 2025-09-27 00:49:34.384320 | orchestrator | TASK [loadbalancer : Copying checks for services which are enabled] ************ 2025-09-27 00:49:34.384331 | orchestrator | Saturday 27 September 2025 00:44:35 +0000 (0:00:00.728) 0:00:29.072 **** 2025-09-27 00:49:34.384343 | orchestrator | changed: [testbed-node-1] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/haproxy:2024.2', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:61313'], 'timeout': '30'}}}) 2025-09-27 00:49:34.384361 | orchestrator | changed: [testbed-node-0] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/haproxy:2024.2', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:61313'], 'timeout': '30'}}}) 2025-09-27 00:49:34.384379 | orchestrator | changed: [testbed-node-2] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/haproxy:2024.2', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:61313'], 'timeout': '30'}}}) 2025-09-27 00:49:34.384395 | orchestrator | changed: [testbed-node-1] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/proxysql:2024.2', 'privileged': False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}}) 2025-09-27 00:49:34.384407 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'keepalived', 'value': {'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/keepalived:2024.2', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}})  2025-09-27 00:49:34.384418 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'haproxy-ssh', 'value': {'container_name': 'haproxy_ssh', 'group': 'loadbalancer', 'enabled': False, 'image': 'registry.osism.tech/kolla/haproxy-ssh:2024.2', 'volumes': ['/etc/kolla/haproxy-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', '__omit_place_holder__f72f76ac0d1b11429434a8aff5d8056b51dd2384', '__omit_place_holder__f72f76ac0d1b11429434a8aff5d8056b51dd2384'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 2985'], 'timeout': '30'}}})  2025-09-27 00:49:34.384430 | orchestrator | changed: [testbed-node-2] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/proxysql:2024.2', 'privileged': False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}}) 2025-09-27 00:49:34.384442 | orchestrator | changed: [testbed-node-0] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/proxysql:2024.2', 'privileged': False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}}) 2025-09-27 00:49:34.384467 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'keepalived', 'value': {'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/keepalived:2024.2', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}})  2025-09-27 00:49:34.384479 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'haproxy-ssh', 'value': {'container_name': 'haproxy_ssh', 'group': 'loadbalancer', 'enabled': False, 'image': 'registry.osism.tech/kolla/haproxy-ssh:2024.2', 'volumes': ['/etc/kolla/haproxy-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', '__omit_place_holder__f72f76ac0d1b11429434a8aff5d8056b51dd2384', '__omit_place_holder__f72f76ac0d1b11429434a8aff5d8056b51dd2384'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 2985'], 'timeout': '30'}}})  2025-09-27 00:49:34.384495 | orchestrator | 
skipping: [testbed-node-0] => (item={'key': 'keepalived', 'value': {'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/keepalived:2024.2', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}})  2025-09-27 00:49:34.384507 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'haproxy-ssh', 'value': {'container_name': 'haproxy_ssh', 'group': 'loadbalancer', 'enabled': False, 'image': 'registry.osism.tech/kolla/haproxy-ssh:2024.2', 'volumes': ['/etc/kolla/haproxy-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', '__omit_place_holder__f72f76ac0d1b11429434a8aff5d8056b51dd2384', '__omit_place_holder__f72f76ac0d1b11429434a8aff5d8056b51dd2384'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 2985'], 'timeout': '30'}}})  2025-09-27 00:49:34.384518 | orchestrator | 2025-09-27 00:49:34.384530 | orchestrator | TASK [loadbalancer : Copying over config.json files for services] ************** 2025-09-27 00:49:34.384541 | orchestrator | Saturday 27 September 2025 00:44:39 +0000 (0:00:03.371) 0:00:32.443 **** 2025-09-27 00:49:34.384552 | orchestrator | changed: [testbed-node-0] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/haproxy:2024.2', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:61313'], 'timeout': '30'}}}) 2025-09-27 00:49:34.384564 | orchestrator | changed: [testbed-node-1] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/haproxy:2024.2', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:61313'], 'timeout': '30'}}}) 2025-09-27 00:49:34.384587 | orchestrator | changed: [testbed-node-2] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/haproxy:2024.2', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:61313'], 'timeout': '30'}}}) 2025-09-27 00:49:34.384599 | orchestrator | 
changed: [testbed-node-0] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/proxysql:2024.2', 'privileged': False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}}) 2025-09-27 00:49:34.384615 | orchestrator | changed: [testbed-node-1] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/proxysql:2024.2', 'privileged': False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}}) 2025-09-27 00:49:34.384627 | orchestrator | changed: [testbed-node-2] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/proxysql:2024.2', 'privileged': False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}}) 2025-09-27 00:49:34.384638 | orchestrator | changed: [testbed-node-0] => (item={'key': 'keepalived', 'value': {'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/keepalived:2024.2', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}}) 2025-09-27 00:49:34.384650 | orchestrator | changed: [testbed-node-2] => (item={'key': 'keepalived', 'value': {'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/keepalived:2024.2', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}}) 2025-09-27 00:49:34.384661 | orchestrator | changed: [testbed-node-1] => (item={'key': 'keepalived', 'value': {'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/keepalived:2024.2', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 
'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}}) 2025-09-27 00:49:34.384679 | orchestrator | 2025-09-27 00:49:34.384690 | orchestrator | TASK [loadbalancer : Copying over haproxy.cfg] ********************************* 2025-09-27 00:49:34.384701 | orchestrator | Saturday 27 September 2025 00:44:43 +0000 (0:00:04.003) 0:00:36.446 **** 2025-09-27 00:49:34.384712 | orchestrator | changed: [testbed-node-0] => (item=/ansible/roles/loadbalancer/templates/haproxy/haproxy_main.cfg.j2) 2025-09-27 00:49:34.384724 | orchestrator | changed: [testbed-node-2] => (item=/ansible/roles/loadbalancer/templates/haproxy/haproxy_main.cfg.j2) 2025-09-27 00:49:34.384735 | orchestrator | changed: [testbed-node-1] => (item=/ansible/roles/loadbalancer/templates/haproxy/haproxy_main.cfg.j2) 2025-09-27 00:49:34.384746 | orchestrator | 2025-09-27 00:49:34.384763 | orchestrator | TASK [loadbalancer : Copying over proxysql config] ***************************** 2025-09-27 00:49:34.384774 | orchestrator | Saturday 27 September 2025 00:44:45 +0000 (0:00:02.517) 0:00:38.964 **** 2025-09-27 00:49:34.384785 | orchestrator | changed: [testbed-node-0] => (item=/ansible/roles/loadbalancer/templates/proxysql/proxysql.yaml.j2) 2025-09-27 00:49:34.384796 | orchestrator | changed: [testbed-node-1] => (item=/ansible/roles/loadbalancer/templates/proxysql/proxysql.yaml.j2) 2025-09-27 00:49:34.384807 | orchestrator | changed: [testbed-node-2] => (item=/ansible/roles/loadbalancer/templates/proxysql/proxysql.yaml.j2) 2025-09-27 00:49:34.384818 | orchestrator | 2025-09-27 00:49:34.384829 | orchestrator | TASK [loadbalancer : Copying over haproxy single external frontend config] ***** 2025-09-27 00:49:34.384840 | orchestrator | Saturday 27 September 2025 00:44:48 +0000 (0:00:03.088) 0:00:42.052 **** 2025-09-27 00:49:34.384851 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:49:34.384862 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:49:34.384872 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:49:34.384883 | orchestrator | 2025-09-27 00:49:34.384894 | orchestrator | TASK [loadbalancer : Copying over custom haproxy services configuration] ******* 2025-09-27 00:49:34.384905 | orchestrator | Saturday 27 September 2025 00:44:49 +0000 (0:00:00.570) 0:00:42.623 **** 2025-09-27 00:49:34.384924 | orchestrator | changed: [testbed-node-0] => (item=/opt/configuration/environments/kolla/files/overlays/haproxy/services.d/haproxy.cfg) 2025-09-27 00:49:34.384936 | orchestrator | changed: [testbed-node-1] => (item=/opt/configuration/environments/kolla/files/overlays/haproxy/services.d/haproxy.cfg) 2025-09-27 00:49:34.384947 | orchestrator | changed: [testbed-node-2] => (item=/opt/configuration/environments/kolla/files/overlays/haproxy/services.d/haproxy.cfg) 2025-09-27 00:49:34.384958 | orchestrator | 2025-09-27 00:49:34.384969 | orchestrator | TASK [loadbalancer : Copying over keepalived.conf] ***************************** 2025-09-27 00:49:34.384980 | orchestrator | Saturday 27 September 2025 00:44:51 +0000 (0:00:02.324) 0:00:44.947 **** 2025-09-27 00:49:34.384991 | orchestrator | changed: [testbed-node-0] => (item=/ansible/roles/loadbalancer/templates/keepalived/keepalived.conf.j2) 2025-09-27 00:49:34.385002 | orchestrator | changed: [testbed-node-1] => (item=/ansible/roles/loadbalancer/templates/keepalived/keepalived.conf.j2) 2025-09-27 00:49:34.385013 | orchestrator | changed: [testbed-node-2] => (item=/ansible/roles/loadbalancer/templates/keepalived/keepalived.conf.j2) 2025-09-27 00:49:34.385023 | 
orchestrator | 2025-09-27 00:49:34.385034 | orchestrator | TASK [loadbalancer : Copying over haproxy.pem] ********************************* 2025-09-27 00:49:34.385045 | orchestrator | Saturday 27 September 2025 00:44:53 +0000 (0:00:01.937) 0:00:46.885 **** 2025-09-27 00:49:34.385056 | orchestrator | changed: [testbed-node-0] => (item=haproxy.pem) 2025-09-27 00:49:34.385067 | orchestrator | changed: [testbed-node-1] => (item=haproxy.pem) 2025-09-27 00:49:34.385078 | orchestrator | changed: [testbed-node-2] => (item=haproxy.pem) 2025-09-27 00:49:34.385089 | orchestrator | 2025-09-27 00:49:34.385100 | orchestrator | TASK [loadbalancer : Copying over haproxy-internal.pem] ************************ 2025-09-27 00:49:34.385117 | orchestrator | Saturday 27 September 2025 00:44:54 +0000 (0:00:01.401) 0:00:48.287 **** 2025-09-27 00:49:34.385128 | orchestrator | changed: [testbed-node-0] => (item=haproxy-internal.pem) 2025-09-27 00:49:34.385139 | orchestrator | changed: [testbed-node-1] => (item=haproxy-internal.pem) 2025-09-27 00:49:34.385150 | orchestrator | changed: [testbed-node-2] => (item=haproxy-internal.pem) 2025-09-27 00:49:34.385161 | orchestrator | 2025-09-27 00:49:34.385172 | orchestrator | TASK [loadbalancer : include_tasks] ******************************************** 2025-09-27 00:49:34.385182 | orchestrator | Saturday 27 September 2025 00:44:56 +0000 (0:00:01.353) 0:00:49.641 **** 2025-09-27 00:49:34.385193 | orchestrator | included: /ansible/roles/loadbalancer/tasks/copy-certs.yml for testbed-node-0, testbed-node-1, testbed-node-2 2025-09-27 00:49:34.385232 | orchestrator | 2025-09-27 00:49:34.385244 | orchestrator | TASK [service-cert-copy : loadbalancer | Copying over extra CA certificates] *** 2025-09-27 00:49:34.385255 | orchestrator | Saturday 27 September 2025 00:44:56 +0000 (0:00:00.487) 0:00:50.128 **** 2025-09-27 00:49:34.385266 | orchestrator | changed: [testbed-node-0] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/haproxy:2024.2', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:61313'], 'timeout': '30'}}}) 2025-09-27 00:49:34.385285 | orchestrator | changed: [testbed-node-1] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/haproxy:2024.2', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:61313'], 'timeout': '30'}}}) 2025-09-27 00:49:34.385296 | orchestrator | changed: [testbed-node-2] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/haproxy:2024.2', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:61313'], 'timeout': '30'}}}) 2025-09-27 00:49:34.385313 | orchestrator | changed: [testbed-node-0] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/proxysql:2024.2', 'privileged': False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}}) 2025-09-27 00:49:34.385325 | orchestrator | changed: [testbed-node-1] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/proxysql:2024.2', 'privileged': False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}}) 2025-09-27 00:49:34.385343 | orchestrator | changed: [testbed-node-2] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/proxysql:2024.2', 'privileged': False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}}) 2025-09-27 00:49:34.385355 | orchestrator | changed: [testbed-node-0] => (item={'key': 'keepalived', 'value': {'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/keepalived:2024.2', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}}) 2025-09-27 00:49:34.385366 | orchestrator | changed: [testbed-node-1] => (item={'key': 'keepalived', 'value': {'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/keepalived:2024.2', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}}) 2025-09-27 00:49:34.385384 | orchestrator | changed: [testbed-node-2] => (item={'key': 'keepalived', 'value': {'container_name': 'keepalived', 'group': 
'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/keepalived:2024.2', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}}) 2025-09-27 00:49:34.385396 | orchestrator | 2025-09-27 00:49:34.385407 | orchestrator | TASK [service-cert-copy : loadbalancer | Copying over backend internal TLS certificate] *** 2025-09-27 00:49:34.385418 | orchestrator | Saturday 27 September 2025 00:45:00 +0000 (0:00:03.312) 0:00:53.441 **** 2025-09-27 00:49:34.385429 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/haproxy:2024.2', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:61313'], 'timeout': '30'}}})  2025-09-27 00:49:34.385445 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/proxysql:2024.2', 'privileged': False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}})  2025-09-27 00:49:34.385463 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'keepalived', 'value': {'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/keepalived:2024.2', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}})  2025-09-27 00:49:34.385475 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:49:34.385486 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/haproxy:2024.2', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:61313'], 'timeout': '30'}}})  2025-09-27 00:49:34.385498 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/proxysql:2024.2', 'privileged': 
False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}})  2025-09-27 00:49:34.385509 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'keepalived', 'value': {'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/keepalived:2024.2', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}})  2025-09-27 00:49:34.385520 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:49:34.385539 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/haproxy:2024.2', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:61313'], 'timeout': '30'}}})  2025-09-27 00:49:34.385556 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/proxysql:2024.2', 'privileged': False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}})  2025-09-27 00:49:34.385568 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'keepalived', 'value': {'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/keepalived:2024.2', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}})  2025-09-27 00:49:34.385585 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:49:34.385596 | orchestrator | 2025-09-27 00:49:34.385607 | orchestrator | TASK [service-cert-copy : loadbalancer | Copying over backend internal TLS key] *** 2025-09-27 00:49:34.385618 | orchestrator | Saturday 27 September 2025 00:45:00 +0000 (0:00:00.555) 0:00:53.996 **** 2025-09-27 00:49:34.385630 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/haproxy:2024.2', 'privileged': True, 'volumes': 
['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:61313'], 'timeout': '30'}}})  2025-09-27 00:49:34.385641 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/proxysql:2024.2', 'privileged': False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}})  2025-09-27 00:49:34.385653 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'keepalived', 'value': {'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/keepalived:2024.2', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}})  2025-09-27 00:49:34.385664 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:49:34.385681 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/haproxy:2024.2', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:61313'], 'timeout': '30'}}})  2025-09-27 00:49:34.385693 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/proxysql:2024.2', 'privileged': False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}})  2025-09-27 00:49:34.385709 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'keepalived', 'value': {'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/keepalived:2024.2', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 
'dimensions': {}}})  2025-09-27 00:49:34.385727 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:49:34.385739 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/haproxy:2024.2', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:61313'], 'timeout': '30'}}})  2025-09-27 00:49:34.385750 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/proxysql:2024.2', 'privileged': False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}})  2025-09-27 00:49:34.385761 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'keepalived', 'value': {'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/keepalived:2024.2', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}})  2025-09-27 00:49:34.385773 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:49:34.385784 | orchestrator | 2025-09-27 00:49:34.385795 | orchestrator | TASK [service-cert-copy : mariadb | Copying over extra CA certificates] ******** 2025-09-27 00:49:34.385806 | orchestrator | Saturday 27 September 2025 00:45:01 +0000 (0:00:00.794) 0:00:54.791 **** 2025-09-27 00:49:34.385817 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/haproxy:2024.2', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:61313'], 'timeout': '30'}}})  2025-09-27 00:49:34.385836 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/proxysql:2024.2', 'privileged': False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 
'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}})  2025-09-27 00:49:34.385848 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'keepalived', 'value': {'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/keepalived:2024.2', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}})  2025-09-27 00:49:34.385865 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:49:34.385882 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/haproxy:2024.2', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:61313'], 'timeout': '30'}}})  2025-09-27 00:49:34.385893 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/proxysql:2024.2', 'privileged': False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}})  2025-09-27 00:49:34.385905 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'keepalived', 'value': {'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/keepalived:2024.2', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}})  2025-09-27 00:49:34.385916 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:49:34.385927 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/haproxy:2024.2', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:61313'], 'timeout': '30'}}})  2025-09-27 00:49:34.385945 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 
'registry.osism.tech/kolla/proxysql:2024.2', 'privileged': False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}})  2025-09-27 00:49:34.385957 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'keepalived', 'value': {'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/keepalived:2024.2', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}})  2025-09-27 00:49:34.385976 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:49:34.385987 | orchestrator | 2025-09-27 00:49:34.385998 | orchestrator | TASK [service-cert-copy : mariadb | Copying over backend internal TLS certificate] *** 2025-09-27 00:49:34.386009 | orchestrator | Saturday 27 September 2025 00:45:02 +0000 (0:00:00.755) 0:00:55.547 **** 2025-09-27 00:49:34.386096 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/haproxy:2024.2', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:61313'], 'timeout': '30'}}})  2025-09-27 00:49:34.386110 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/proxysql:2024.2', 'privileged': False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}})  2025-09-27 00:49:34.386121 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'keepalived', 'value': {'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/keepalived:2024.2', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}})  2025-09-27 00:49:34.386132 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:49:34.386144 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 
'registry.osism.tech/kolla/haproxy:2024.2', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:61313'], 'timeout': '30'}}})  2025-09-27 00:49:34.386155 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/proxysql:2024.2', 'privileged': False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}})  2025-09-27 00:49:34.386174 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'keepalived', 'value': {'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/keepalived:2024.2', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}})  2025-09-27 00:49:34.386216 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:49:34.386229 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/haproxy:2024.2', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:61313'], 'timeout': '30'}}})  2025-09-27 00:49:34.386249 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/proxysql:2024.2', 'privileged': False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}})  2025-09-27 00:49:34.386261 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'keepalived', 'value': {'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/keepalived:2024.2', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 
'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}})  2025-09-27 00:49:34.386272 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:49:34.386283 | orchestrator | 2025-09-27 00:49:34.386294 | orchestrator | TASK [service-cert-copy : mariadb | Copying over backend internal TLS key] ***** 2025-09-27 00:49:34.386305 | orchestrator | Saturday 27 September 2025 00:45:03 +0000 (0:00:01.175) 0:00:56.722 **** 2025-09-27 00:49:34.386316 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/haproxy:2024.2', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:61313'], 'timeout': '30'}}})  2025-09-27 00:49:34.386327 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/proxysql:2024.2', 'privileged': False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}})  2025-09-27 00:49:34.386339 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'keepalived', 'value': {'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/keepalived:2024.2', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}})  2025-09-27 00:49:34.386365 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/haproxy:2024.2', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:61313'], 'timeout': '30'}}})  2025-09-27 00:49:34.386377 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/proxysql:2024.2', 'privileged': False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': 
{'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}})  2025-09-27 00:49:34.386388 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:49:34.386399 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'keepalived', 'value': {'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/keepalived:2024.2', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}})  2025-09-27 00:49:34.386410 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:49:34.386422 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/haproxy:2024.2', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:61313'], 'timeout': '30'}}})  2025-09-27 00:49:34.386433 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/proxysql:2024.2', 'privileged': False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}})  2025-09-27 00:49:34.386445 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'keepalived', 'value': {'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/keepalived:2024.2', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}})  2025-09-27 00:49:34.386462 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:49:34.386473 | orchestrator | 2025-09-27 00:49:34.386484 | orchestrator | TASK [service-cert-copy : proxysql | Copying over extra CA certificates] ******* 2025-09-27 00:49:34.386495 | orchestrator | Saturday 27 September 2025 00:45:05 +0000 (0:00:01.636) 0:00:58.358 **** 2025-09-27 00:49:34.386529 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/haproxy:2024.2', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 
'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:61313'], 'timeout': '30'}}})  2025-09-27 00:49:34.386542 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/proxysql:2024.2', 'privileged': False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}})  2025-09-27 00:49:34.386558 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'keepalived', 'value': {'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/keepalived:2024.2', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}})  2025-09-27 00:49:34.386570 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:49:34.386581 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/haproxy:2024.2', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:61313'], 'timeout': '30'}}})  2025-09-27 00:49:34.386592 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/proxysql:2024.2', 'privileged': False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}})  2025-09-27 00:49:34.386604 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'keepalived', 'value': {'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/keepalived:2024.2', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}})  2025-09-27 00:49:34.386621 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:49:34.386633 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 
'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/haproxy:2024.2', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:61313'], 'timeout': '30'}}})  2025-09-27 00:49:34.386651 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/proxysql:2024.2', 'privileged': False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}})  2025-09-27 00:49:34.386662 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'keepalived', 'value': {'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/keepalived:2024.2', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}})  2025-09-27 00:49:34.386678 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:49:34.386690 | orchestrator | 2025-09-27 00:49:34.386701 | orchestrator | TASK [service-cert-copy : proxysql | Copying over backend internal TLS certificate] *** 2025-09-27 00:49:34.386712 | orchestrator | Saturday 27 September 2025 00:45:06 +0000 (0:00:01.705) 0:01:00.064 **** 2025-09-27 00:49:34.386723 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/haproxy:2024.2', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:61313'], 'timeout': '30'}}})  2025-09-27 00:49:34.386735 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/proxysql:2024.2', 'privileged': False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}})  2025-09-27 00:49:34.386746 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'keepalived', 'value': 
{'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/keepalived:2024.2', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}})  2025-09-27 00:49:34.386764 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:49:34.386775 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/haproxy:2024.2', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:61313'], 'timeout': '30'}}})  2025-09-27 00:49:34.386793 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/proxysql:2024.2', 'privileged': False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}})  2025-09-27 00:49:34.386805 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'keepalived', 'value': {'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/keepalived:2024.2', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}})  2025-09-27 00:49:34.386816 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:49:34.386832 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/haproxy:2024.2', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:61313'], 'timeout': '30'}}})  2025-09-27 00:49:34.386844 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/proxysql:2024.2', 'privileged': False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 
'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}})  2025-09-27 00:49:34.386855 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'keepalived', 'value': {'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/keepalived:2024.2', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}})  2025-09-27 00:49:34.386866 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:49:34.386877 | orchestrator | 2025-09-27 00:49:34.386888 | orchestrator | TASK [service-cert-copy : proxysql | Copying over backend internal TLS key] **** 2025-09-27 00:49:34.386906 | orchestrator | Saturday 27 September 2025 00:45:07 +0000 (0:00:01.245) 0:01:01.310 **** 2025-09-27 00:49:34.386917 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/haproxy:2024.2', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:61313'], 'timeout': '30'}}})  2025-09-27 00:49:34.386929 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/proxysql:2024.2', 'privileged': False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}})  2025-09-27 00:49:34.386948 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'keepalived', 'value': {'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/keepalived:2024.2', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}})  2025-09-27 00:49:34.386959 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:49:34.386975 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/haproxy:2024.2', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 
'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:61313'], 'timeout': '30'}}})  2025-09-27 00:49:34.386987 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/proxysql:2024.2', 'privileged': False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}})  2025-09-27 00:49:34.386999 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'keepalived', 'value': {'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/keepalived:2024.2', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}})  2025-09-27 00:49:34.387010 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:49:34.387021 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/haproxy:2024.2', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:61313'], 'timeout': '30'}}})  2025-09-27 00:49:34.387039 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/proxysql:2024.2', 'privileged': False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}})  2025-09-27 00:49:34.387050 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'keepalived', 'value': {'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/keepalived:2024.2', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}})  2025-09-27 00:49:34.387062 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:49:34.387073 | orchestrator | 2025-09-27 00:49:34.387090 | orchestrator 
| TASK [loadbalancer : Copying over haproxy start script] ************************ 2025-09-27 00:49:34.387102 | orchestrator | Saturday 27 September 2025 00:45:09 +0000 (0:00:01.294) 0:01:02.605 **** 2025-09-27 00:49:34.387113 | orchestrator | changed: [testbed-node-2] => (item=/ansible/roles/loadbalancer/templates/haproxy/haproxy_run.sh.j2) 2025-09-27 00:49:34.387124 | orchestrator | changed: [testbed-node-0] => (item=/ansible/roles/loadbalancer/templates/haproxy/haproxy_run.sh.j2) 2025-09-27 00:49:34.387134 | orchestrator | changed: [testbed-node-1] => (item=/ansible/roles/loadbalancer/templates/haproxy/haproxy_run.sh.j2) 2025-09-27 00:49:34.387145 | orchestrator | 2025-09-27 00:49:34.387156 | orchestrator | TASK [loadbalancer : Copying over proxysql start script] *********************** 2025-09-27 00:49:34.387167 | orchestrator | Saturday 27 September 2025 00:45:10 +0000 (0:00:01.711) 0:01:04.316 **** 2025-09-27 00:49:34.387178 | orchestrator | changed: [testbed-node-0] => (item=/ansible/roles/loadbalancer/templates/proxysql/proxysql_run.sh.j2) 2025-09-27 00:49:34.387189 | orchestrator | changed: [testbed-node-1] => (item=/ansible/roles/loadbalancer/templates/proxysql/proxysql_run.sh.j2) 2025-09-27 00:49:34.387218 | orchestrator | changed: [testbed-node-2] => (item=/ansible/roles/loadbalancer/templates/proxysql/proxysql_run.sh.j2) 2025-09-27 00:49:34.387229 | orchestrator | 2025-09-27 00:49:34.387245 | orchestrator | TASK [loadbalancer : Copying files for haproxy-ssh] **************************** 2025-09-27 00:49:34.387256 | orchestrator | Saturday 27 September 2025 00:45:12 +0000 (0:00:01.378) 0:01:05.695 **** 2025-09-27 00:49:34.387267 | orchestrator | skipping: [testbed-node-0] => (item={'src': 'haproxy-ssh/sshd_config.j2', 'dest': 'sshd_config'})  2025-09-27 00:49:34.387278 | orchestrator | skipping: [testbed-node-1] => (item={'src': 'haproxy-ssh/sshd_config.j2', 'dest': 'sshd_config'})  2025-09-27 00:49:34.387289 | orchestrator | skipping: [testbed-node-0] => (item={'src': 'haproxy-ssh/id_rsa.pub', 'dest': 'id_rsa.pub'})  2025-09-27 00:49:34.387299 | orchestrator | skipping: [testbed-node-2] => (item={'src': 'haproxy-ssh/sshd_config.j2', 'dest': 'sshd_config'})  2025-09-27 00:49:34.387310 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:49:34.387327 | orchestrator | skipping: [testbed-node-1] => (item={'src': 'haproxy-ssh/id_rsa.pub', 'dest': 'id_rsa.pub'})  2025-09-27 00:49:34.387338 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:49:34.387349 | orchestrator | skipping: [testbed-node-2] => (item={'src': 'haproxy-ssh/id_rsa.pub', 'dest': 'id_rsa.pub'})  2025-09-27 00:49:34.387360 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:49:34.387371 | orchestrator | 2025-09-27 00:49:34.387382 | orchestrator | TASK [loadbalancer : Check loadbalancer containers] **************************** 2025-09-27 00:49:34.387393 | orchestrator | Saturday 27 September 2025 00:45:13 +0000 (0:00:00.977) 0:01:06.672 **** 2025-09-27 00:49:34.387404 | orchestrator | changed: [testbed-node-1] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/haproxy:2024.2', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': 
'5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:61313'], 'timeout': '30'}}}) 2025-09-27 00:49:34.387416 | orchestrator | changed: [testbed-node-2] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/haproxy:2024.2', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:61313'], 'timeout': '30'}}}) 2025-09-27 00:49:34.387428 | orchestrator | changed: [testbed-node-0] => (item={'key': 'haproxy', 'value': {'container_name': 'haproxy', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/haproxy:2024.2', 'privileged': True, 'volumes': ['/etc/kolla/haproxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'letsencrypt_certificates:/etc/haproxy/certificates'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:61313'], 'timeout': '30'}}}) 2025-09-27 00:49:34.387446 | orchestrator | changed: [testbed-node-2] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/proxysql:2024.2', 'privileged': False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}}) 2025-09-27 00:49:34.387462 | orchestrator | changed: [testbed-node-1] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/proxysql:2024.2', 'privileged': False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}}) 2025-09-27 00:49:34.387474 | orchestrator | changed: [testbed-node-0] => (item={'key': 'proxysql', 'value': {'container_name': 'proxysql', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/proxysql:2024.2', 'privileged': False, 'volumes': ['/etc/kolla/proxysql/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'proxysql:/var/lib/proxysql/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen proxysql 6032'], 'timeout': '30'}}}) 2025-09-27 00:49:34.387496 | orchestrator | changed: [testbed-node-1] => (item={'key': 'keepalived', 'value': 
{'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/keepalived:2024.2', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}}) 2025-09-27 00:49:34.387508 | orchestrator | changed: [testbed-node-2] => (item={'key': 'keepalived', 'value': {'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/keepalived:2024.2', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}}) 2025-09-27 00:49:34.387519 | orchestrator | changed: [testbed-node-0] => (item={'key': 'keepalived', 'value': {'container_name': 'keepalived', 'group': 'loadbalancer', 'enabled': True, 'image': 'registry.osism.tech/kolla/keepalived:2024.2', 'privileged': True, 'volumes': ['/etc/kolla/keepalived/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'haproxy_socket:/var/lib/kolla/haproxy/', 'proxysql_socket:/var/lib/kolla/proxysql/'], 'dimensions': {}}}) 2025-09-27 00:49:34.387530 | orchestrator | 2025-09-27 00:49:34.387541 | orchestrator | TASK [include_role : aodh] ***************************************************** 2025-09-27 00:49:34.387553 | orchestrator | Saturday 27 September 2025 00:45:16 +0000 (0:00:02.982) 0:01:09.655 **** 2025-09-27 00:49:34.387564 | orchestrator | included: aodh for testbed-node-0, testbed-node-1, testbed-node-2 2025-09-27 00:49:34.387574 | orchestrator | 2025-09-27 00:49:34.387585 | orchestrator | TASK [haproxy-config : Copying over aodh haproxy config] *********************** 2025-09-27 00:49:34.387596 | orchestrator | Saturday 27 September 2025 00:45:16 +0000 (0:00:00.635) 0:01:10.290 **** 2025-09-27 00:49:34.387616 | orchestrator | changed: [testbed-node-1] => (item={'key': 'aodh-api', 'value': {'container_name': 'aodh_api', 'group': 'aodh-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/aodh-api:2024.2', 'volumes': ['/etc/kolla/aodh-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'aodh:/var/lib/aodh/', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:8042'], 'timeout': '30'}, 'haproxy': {'aodh_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '8042', 'listen_port': '8042'}, 'aodh_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8042', 'listen_port': '8042'}}}}) 2025-09-27 00:49:34.387633 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'aodh-evaluator', 'value': {'container_name': 'aodh_evaluator', 'group': 'aodh-evaluator', 'enabled': True, 'image': 'registry.osism.tech/kolla/aodh-evaluator:2024.2', 'volumes': ['/etc/kolla/aodh-evaluator/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': 
{'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port aodh-evaluator 3306'], 'timeout': '30'}}})  2025-09-27 00:49:34.387652 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'aodh-listener', 'value': {'container_name': 'aodh_listener', 'group': 'aodh-listener', 'enabled': True, 'image': 'registry.osism.tech/kolla/aodh-listener:2024.2', 'volumes': ['/etc/kolla/aodh-listener/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port aodh-listener 5672'], 'timeout': '30'}}})  2025-09-27 00:49:34.387664 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'aodh-notifier', 'value': {'container_name': 'aodh_notifier', 'group': 'aodh-notifier', 'enabled': True, 'image': 'registry.osism.tech/kolla/aodh-notifier:2024.2', 'volumes': ['/etc/kolla/aodh-notifier/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port aodh-notifier 5672'], 'timeout': '30'}}})  2025-09-27 00:49:34.387676 | orchestrator | changed: [testbed-node-0] => (item={'key': 'aodh-api', 'value': {'container_name': 'aodh_api', 'group': 'aodh-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/aodh-api:2024.2', 'volumes': ['/etc/kolla/aodh-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'aodh:/var/lib/aodh/', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:8042'], 'timeout': '30'}, 'haproxy': {'aodh_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '8042', 'listen_port': '8042'}, 'aodh_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8042', 'listen_port': '8042'}}}}) 2025-09-27 00:49:34.387688 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'aodh-evaluator', 'value': {'container_name': 'aodh_evaluator', 'group': 'aodh-evaluator', 'enabled': True, 'image': 'registry.osism.tech/kolla/aodh-evaluator:2024.2', 'volumes': ['/etc/kolla/aodh-evaluator/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port aodh-evaluator 3306'], 'timeout': '30'}}})  2025-09-27 00:49:34.387706 | orchestrator | changed: [testbed-node-2] => (item={'key': 'aodh-api', 'value': {'container_name': 'aodh_api', 'group': 'aodh-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/aodh-api:2024.2', 'volumes': ['/etc/kolla/aodh-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'aodh:/var/lib/aodh/', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:8042'], 'timeout': '30'}, 'haproxy': {'aodh_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '8042', 
'listen_port': '8042'}, 'aodh_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8042', 'listen_port': '8042'}}}}) 2025-09-27 00:49:34.387718 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'aodh-listener', 'value': {'container_name': 'aodh_listener', 'group': 'aodh-listener', 'enabled': True, 'image': 'registry.osism.tech/kolla/aodh-listener:2024.2', 'volumes': ['/etc/kolla/aodh-listener/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port aodh-listener 5672'], 'timeout': '30'}}})  2025-09-27 00:49:34.387740 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'aodh-evaluator', 'value': {'container_name': 'aodh_evaluator', 'group': 'aodh-evaluator', 'enabled': True, 'image': 'registry.osism.tech/kolla/aodh-evaluator:2024.2', 'volumes': ['/etc/kolla/aodh-evaluator/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port aodh-evaluator 3306'], 'timeout': '30'}}})  2025-09-27 00:49:34.387752 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'aodh-notifier', 'value': {'container_name': 'aodh_notifier', 'group': 'aodh-notifier', 'enabled': True, 'image': 'registry.osism.tech/kolla/aodh-notifier:2024.2', 'volumes': ['/etc/kolla/aodh-notifier/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port aodh-notifier 5672'], 'timeout': '30'}}})  2025-09-27 00:49:34.387763 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'aodh-listener', 'value': {'container_name': 'aodh_listener', 'group': 'aodh-listener', 'enabled': True, 'image': 'registry.osism.tech/kolla/aodh-listener:2024.2', 'volumes': ['/etc/kolla/aodh-listener/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port aodh-listener 5672'], 'timeout': '30'}}})  2025-09-27 00:49:34.387775 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'aodh-notifier', 'value': {'container_name': 'aodh_notifier', 'group': 'aodh-notifier', 'enabled': True, 'image': 'registry.osism.tech/kolla/aodh-notifier:2024.2', 'volumes': ['/etc/kolla/aodh-notifier/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port aodh-notifier 5672'], 'timeout': '30'}}})  2025-09-27 00:49:34.387786 | orchestrator | 2025-09-27 00:49:34.387796 | orchestrator | TASK [haproxy-config : Add configuration for aodh when using single external frontend] *** 2025-09-27 00:49:34.387807 | orchestrator | Saturday 27 September 2025 00:45:20 +0000 (0:00:03.257) 0:01:13.548 **** 2025-09-27 00:49:34.387825 | 
orchestrator | skipping: [testbed-node-0] => (item={'key': 'aodh-api', 'value': {'container_name': 'aodh_api', 'group': 'aodh-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/aodh-api:2024.2', 'volumes': ['/etc/kolla/aodh-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'aodh:/var/lib/aodh/', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:8042'], 'timeout': '30'}, 'haproxy': {'aodh_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '8042', 'listen_port': '8042'}, 'aodh_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8042', 'listen_port': '8042'}}}})  2025-09-27 00:49:34.387837 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'aodh-evaluator', 'value': {'container_name': 'aodh_evaluator', 'group': 'aodh-evaluator', 'enabled': True, 'image': 'registry.osism.tech/kolla/aodh-evaluator:2024.2', 'volumes': ['/etc/kolla/aodh-evaluator/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port aodh-evaluator 3306'], 'timeout': '30'}}})  2025-09-27 00:49:34.387860 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'aodh-listener', 'value': {'container_name': 'aodh_listener', 'group': 'aodh-listener', 'enabled': True, 'image': 'registry.osism.tech/kolla/aodh-listener:2024.2', 'volumes': ['/etc/kolla/aodh-listener/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port aodh-listener 5672'], 'timeout': '30'}}})  2025-09-27 00:49:34.387872 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'aodh-notifier', 'value': {'container_name': 'aodh_notifier', 'group': 'aodh-notifier', 'enabled': True, 'image': 'registry.osism.tech/kolla/aodh-notifier:2024.2', 'volumes': ['/etc/kolla/aodh-notifier/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port aodh-notifier 5672'], 'timeout': '30'}}})  2025-09-27 00:49:34.387883 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:49:34.387895 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'aodh-api', 'value': {'container_name': 'aodh_api', 'group': 'aodh-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/aodh-api:2024.2', 'volumes': ['/etc/kolla/aodh-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'aodh:/var/lib/aodh/', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:8042'], 'timeout': '30'}, 'haproxy': {'aodh_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '8042', 'listen_port': '8042'}, 'aodh_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 
'external_fqdn': 'api.testbed.osism.xyz', 'port': '8042', 'listen_port': '8042'}}}})  2025-09-27 00:49:34.387906 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'aodh-evaluator', 'value': {'container_name': 'aodh_evaluator', 'group': 'aodh-evaluator', 'enabled': True, 'image': 'registry.osism.tech/kolla/aodh-evaluator:2024.2', 'volumes': ['/etc/kolla/aodh-evaluator/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port aodh-evaluator 3306'], 'timeout': '30'}}})  2025-09-27 00:49:34.387917 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'aodh-listener', 'value': {'container_name': 'aodh_listener', 'group': 'aodh-listener', 'enabled': True, 'image': 'registry.osism.tech/kolla/aodh-listener:2024.2', 'volumes': ['/etc/kolla/aodh-listener/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port aodh-listener 5672'], 'timeout': '30'}}})  2025-09-27 00:49:34.387935 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'aodh-notifier', 'value': {'container_name': 'aodh_notifier', 'group': 'aodh-notifier', 'enabled': True, 'image': 'registry.osism.tech/kolla/aodh-notifier:2024.2', 'volumes': ['/etc/kolla/aodh-notifier/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port aodh-notifier 5672'], 'timeout': '30'}}})  2025-09-27 00:49:34.387953 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:49:34.387969 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'aodh-api', 'value': {'container_name': 'aodh_api', 'group': 'aodh-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/aodh-api:2024.2', 'volumes': ['/etc/kolla/aodh-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'aodh:/var/lib/aodh/', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:8042'], 'timeout': '30'}, 'haproxy': {'aodh_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '8042', 'listen_port': '8042'}, 'aodh_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8042', 'listen_port': '8042'}}}})  2025-09-27 00:49:34.387981 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'aodh-evaluator', 'value': {'container_name': 'aodh_evaluator', 'group': 'aodh-evaluator', 'enabled': True, 'image': 'registry.osism.tech/kolla/aodh-evaluator:2024.2', 'volumes': ['/etc/kolla/aodh-evaluator/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port aodh-evaluator 3306'], 'timeout': '30'}}})  2025-09-27 00:49:34.387992 | orchestrator | skipping: [testbed-node-2] => 
(item={'key': 'aodh-listener', 'value': {'container_name': 'aodh_listener', 'group': 'aodh-listener', 'enabled': True, 'image': 'registry.osism.tech/kolla/aodh-listener:2024.2', 'volumes': ['/etc/kolla/aodh-listener/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port aodh-listener 5672'], 'timeout': '30'}}})  2025-09-27 00:49:34.388004 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'aodh-notifier', 'value': {'container_name': 'aodh_notifier', 'group': 'aodh-notifier', 'enabled': True, 'image': 'registry.osism.tech/kolla/aodh-notifier:2024.2', 'volumes': ['/etc/kolla/aodh-notifier/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port aodh-notifier 5672'], 'timeout': '30'}}})  2025-09-27 00:49:34.388015 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:49:34.388026 | orchestrator | 2025-09-27 00:49:34.388037 | orchestrator | TASK [haproxy-config : Configuring firewall for aodh] ************************** 2025-09-27 00:49:34.388048 | orchestrator | Saturday 27 September 2025 00:45:21 +0000 (0:00:00.839) 0:01:14.388 **** 2025-09-27 00:49:34.388059 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'aodh_api', 'value': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '8042', 'listen_port': '8042'}})  2025-09-27 00:49:34.388071 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'aodh_api_external', 'value': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8042', 'listen_port': '8042'}})  2025-09-27 00:49:34.388083 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:49:34.388094 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'aodh_api', 'value': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '8042', 'listen_port': '8042'}})  2025-09-27 00:49:34.388105 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'aodh_api_external', 'value': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8042', 'listen_port': '8042'}})  2025-09-27 00:49:34.388123 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:49:34.388142 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'aodh_api', 'value': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '8042', 'listen_port': '8042'}})  2025-09-27 00:49:34.388153 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'aodh_api_external', 'value': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8042', 'listen_port': '8042'}})  2025-09-27 00:49:34.388164 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:49:34.388175 | orchestrator | 2025-09-27 00:49:34.388186 | orchestrator | TASK [proxysql-config : Copying over aodh ProxySQL users config] *************** 2025-09-27 00:49:34.388214 | orchestrator | Saturday 27 September 2025 00:45:21 +0000 (0:00:00.913) 0:01:15.302 **** 2025-09-27 00:49:34.388225 | orchestrator | changed: [testbed-node-0] 2025-09-27 00:49:34.388236 | orchestrator | changed: [testbed-node-1] 2025-09-27 00:49:34.388247 | 
orchestrator | changed: [testbed-node-2] 2025-09-27 00:49:34.388258 | orchestrator | 2025-09-27 00:49:34.388268 | orchestrator | TASK [proxysql-config : Copying over aodh ProxySQL rules config] *************** 2025-09-27 00:49:34.388279 | orchestrator | Saturday 27 September 2025 00:45:23 +0000 (0:00:01.225) 0:01:16.528 **** 2025-09-27 00:49:34.388290 | orchestrator | changed: [testbed-node-0] 2025-09-27 00:49:34.388301 | orchestrator | changed: [testbed-node-1] 2025-09-27 00:49:34.388311 | orchestrator | changed: [testbed-node-2] 2025-09-27 00:49:34.388322 | orchestrator | 2025-09-27 00:49:34.388338 | orchestrator | TASK [include_role : barbican] ************************************************* 2025-09-27 00:49:34.388349 | orchestrator | Saturday 27 September 2025 00:45:24 +0000 (0:00:01.819) 0:01:18.347 **** 2025-09-27 00:49:34.388360 | orchestrator | included: barbican for testbed-node-0, testbed-node-1, testbed-node-2 2025-09-27 00:49:34.388371 | orchestrator | 2025-09-27 00:49:34.388382 | orchestrator | TASK [haproxy-config : Copying over barbican haproxy config] ******************* 2025-09-27 00:49:34.388392 | orchestrator | Saturday 27 September 2025 00:45:25 +0000 (0:00:00.721) 0:01:19.069 **** 2025-09-27 00:49:34.388404 | orchestrator | changed: [testbed-node-0] => (item={'key': 'barbican-api', 'value': {'container_name': 'barbican_api', 'group': 'barbican-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/barbican-api:2024.2', 'volumes': ['/etc/kolla/barbican-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'barbican:/var/lib/barbican/', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:9311'], 'timeout': '30'}, 'haproxy': {'barbican_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9311', 'listen_port': '9311', 'tls_backend': 'no'}, 'barbican_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9311', 'listen_port': '9311', 'tls_backend': 'no'}}}}) 2025-09-27 00:49:34.388416 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'barbican-keystone-listener', 'value': {'container_name': 'barbican_keystone_listener', 'group': 'barbican-keystone-listener', 'enabled': True, 'image': 'registry.osism.tech/kolla/barbican-keystone-listener:2024.2', 'volumes': ['/etc/kolla/barbican-keystone-listener/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port barbican-keystone-listener 5672'], 'timeout': '30'}}})  2025-09-27 00:49:34.388428 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'barbican-worker', 'value': {'container_name': 'barbican_worker', 'group': 'barbican-worker', 'enabled': True, 'image': 'registry.osism.tech/kolla/barbican-worker:2024.2', 'volumes': ['/etc/kolla/barbican-worker/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port barbican-worker 5672'], 'timeout': '30'}}})  2025-09-27 00:49:34.388457 | orchestrator | changed: 
[testbed-node-2] => (item={'key': 'barbican-api', 'value': {'container_name': 'barbican_api', 'group': 'barbican-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/barbican-api:2024.2', 'volumes': ['/etc/kolla/barbican-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'barbican:/var/lib/barbican/', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:9311'], 'timeout': '30'}, 'haproxy': {'barbican_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9311', 'listen_port': '9311', 'tls_backend': 'no'}, 'barbican_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9311', 'listen_port': '9311', 'tls_backend': 'no'}}}}) 2025-09-27 00:49:34.388474 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'barbican-keystone-listener', 'value': {'container_name': 'barbican_keystone_listener', 'group': 'barbican-keystone-listener', 'enabled': True, 'image': 'registry.osism.tech/kolla/barbican-keystone-listener:2024.2', 'volumes': ['/etc/kolla/barbican-keystone-listener/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port barbican-keystone-listener 5672'], 'timeout': '30'}}})  2025-09-27 00:49:34.388486 | orchestrator | changed: [testbed-node-1] => (item={'key': 'barbican-api', 'value': {'container_name': 'barbican_api', 'group': 'barbican-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/barbican-api:2024.2', 'volumes': ['/etc/kolla/barbican-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'barbican:/var/lib/barbican/', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:9311'], 'timeout': '30'}, 'haproxy': {'barbican_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9311', 'listen_port': '9311', 'tls_backend': 'no'}, 'barbican_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9311', 'listen_port': '9311', 'tls_backend': 'no'}}}}) 2025-09-27 00:49:34.388498 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'barbican-worker', 'value': {'container_name': 'barbican_worker', 'group': 'barbican-worker', 'enabled': True, 'image': 'registry.osism.tech/kolla/barbican-worker:2024.2', 'volumes': ['/etc/kolla/barbican-worker/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port barbican-worker 5672'], 'timeout': '30'}}})  2025-09-27 00:49:34.388509 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'barbican-keystone-listener', 'value': {'container_name': 'barbican_keystone_listener', 'group': 'barbican-keystone-listener', 'enabled': True, 'image': 'registry.osism.tech/kolla/barbican-keystone-listener:2024.2', 'volumes': 
['/etc/kolla/barbican-keystone-listener/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port barbican-keystone-listener 5672'], 'timeout': '30'}}})  2025-09-27 00:49:34.388532 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'barbican-worker', 'value': {'container_name': 'barbican_worker', 'group': 'barbican-worker', 'enabled': True, 'image': 'registry.osism.tech/kolla/barbican-worker:2024.2', 'volumes': ['/etc/kolla/barbican-worker/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port barbican-worker 5672'], 'timeout': '30'}}})  2025-09-27 00:49:34.388544 | orchestrator | 2025-09-27 00:49:34.388555 | orchestrator | TASK [haproxy-config : Add configuration for barbican when using single external frontend] *** 2025-09-27 00:49:34.388566 | orchestrator | Saturday 27 September 2025 00:45:28 +0000 (0:00:03.127) 0:01:22.197 **** 2025-09-27 00:49:34.388584 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'barbican-api', 'value': {'container_name': 'barbican_api', 'group': 'barbican-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/barbican-api:2024.2', 'volumes': ['/etc/kolla/barbican-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'barbican:/var/lib/barbican/', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:9311'], 'timeout': '30'}, 'haproxy': {'barbican_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9311', 'listen_port': '9311', 'tls_backend': 'no'}, 'barbican_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9311', 'listen_port': '9311', 'tls_backend': 'no'}}}})  2025-09-27 00:49:34.388601 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'barbican-keystone-listener', 'value': {'container_name': 'barbican_keystone_listener', 'group': 'barbican-keystone-listener', 'enabled': True, 'image': 'registry.osism.tech/kolla/barbican-keystone-listener:2024.2', 'volumes': ['/etc/kolla/barbican-keystone-listener/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port barbican-keystone-listener 5672'], 'timeout': '30'}}})  2025-09-27 00:49:34.388613 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'barbican-worker', 'value': {'container_name': 'barbican_worker', 'group': 'barbican-worker', 'enabled': True, 'image': 'registry.osism.tech/kolla/barbican-worker:2024.2', 'volumes': ['/etc/kolla/barbican-worker/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port barbican-worker 5672'], 'timeout': '30'}}})  
2025-09-27 00:49:34.388624 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:49:34.388635 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'barbican-api', 'value': {'container_name': 'barbican_api', 'group': 'barbican-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/barbican-api:2024.2', 'volumes': ['/etc/kolla/barbican-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'barbican:/var/lib/barbican/', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:9311'], 'timeout': '30'}, 'haproxy': {'barbican_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9311', 'listen_port': '9311', 'tls_backend': 'no'}, 'barbican_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9311', 'listen_port': '9311', 'tls_backend': 'no'}}}})  2025-09-27 00:49:34.388657 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'barbican-keystone-listener', 'value': {'container_name': 'barbican_keystone_listener', 'group': 'barbican-keystone-listener', 'enabled': True, 'image': 'registry.osism.tech/kolla/barbican-keystone-listener:2024.2', 'volumes': ['/etc/kolla/barbican-keystone-listener/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port barbican-keystone-listener 5672'], 'timeout': '30'}}})  2025-09-27 00:49:34.388674 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'barbican-worker', 'value': {'container_name': 'barbican_worker', 'group': 'barbican-worker', 'enabled': True, 'image': 'registry.osism.tech/kolla/barbican-worker:2024.2', 'volumes': ['/etc/kolla/barbican-worker/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port barbican-worker 5672'], 'timeout': '30'}}})  2025-09-27 00:49:34.388685 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:49:34.390760 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'barbican-api', 'value': {'container_name': 'barbican_api', 'group': 'barbican-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/barbican-api:2024.2', 'volumes': ['/etc/kolla/barbican-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'barbican:/var/lib/barbican/', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:9311'], 'timeout': '30'}, 'haproxy': {'barbican_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9311', 'listen_port': '9311', 'tls_backend': 'no'}, 'barbican_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9311', 'listen_port': '9311', 'tls_backend': 'no'}}}})  2025-09-27 00:49:34.390778 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'barbican-keystone-listener', 'value': {'container_name': 'barbican_keystone_listener', 
'group': 'barbican-keystone-listener', 'enabled': True, 'image': 'registry.osism.tech/kolla/barbican-keystone-listener:2024.2', 'volumes': ['/etc/kolla/barbican-keystone-listener/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port barbican-keystone-listener 5672'], 'timeout': '30'}}})  2025-09-27 00:49:34.390787 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'barbican-worker', 'value': {'container_name': 'barbican_worker', 'group': 'barbican-worker', 'enabled': True, 'image': 'registry.osism.tech/kolla/barbican-worker:2024.2', 'volumes': ['/etc/kolla/barbican-worker/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port barbican-worker 5672'], 'timeout': '30'}}})  2025-09-27 00:49:34.390795 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:49:34.390803 | orchestrator | 2025-09-27 00:49:34.390811 | orchestrator | TASK [haproxy-config : Configuring firewall for barbican] ********************** 2025-09-27 00:49:34.390819 | orchestrator | Saturday 27 September 2025 00:45:29 +0000 (0:00:00.612) 0:01:22.809 **** 2025-09-27 00:49:34.390834 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'barbican_api', 'value': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9311', 'listen_port': '9311', 'tls_backend': 'no'}})  2025-09-27 00:49:34.390843 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'barbican_api_external', 'value': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9311', 'listen_port': '9311', 'tls_backend': 'no'}})  2025-09-27 00:49:34.390851 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:49:34.390859 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'barbican_api', 'value': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9311', 'listen_port': '9311', 'tls_backend': 'no'}})  2025-09-27 00:49:34.390867 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'barbican_api_external', 'value': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9311', 'listen_port': '9311', 'tls_backend': 'no'}})  2025-09-27 00:49:34.390875 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:49:34.390883 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'barbican_api', 'value': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9311', 'listen_port': '9311', 'tls_backend': 'no'}})  2025-09-27 00:49:34.390891 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'barbican_api_external', 'value': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9311', 'listen_port': '9311', 'tls_backend': 'no'}})  2025-09-27 00:49:34.390898 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:49:34.390906 | orchestrator | 2025-09-27 00:49:34.390914 | orchestrator | TASK [proxysql-config : Copying over barbican ProxySQL users config] *********** 2025-09-27 00:49:34.390922 | orchestrator | Saturday 27 September 2025 00:45:30 +0000 (0:00:00.896) 0:01:23.706 **** 2025-09-27 00:49:34.390930 | orchestrator 
| changed: [testbed-node-0] 2025-09-27 00:49:34.390938 | orchestrator | changed: [testbed-node-1] 2025-09-27 00:49:34.390945 | orchestrator | changed: [testbed-node-2] 2025-09-27 00:49:34.390953 | orchestrator | 2025-09-27 00:49:34.390961 | orchestrator | TASK [proxysql-config : Copying over barbican ProxySQL rules config] *********** 2025-09-27 00:49:34.390968 | orchestrator | Saturday 27 September 2025 00:45:31 +0000 (0:00:01.427) 0:01:25.133 **** 2025-09-27 00:49:34.390976 | orchestrator | changed: [testbed-node-1] 2025-09-27 00:49:34.390984 | orchestrator | changed: [testbed-node-0] 2025-09-27 00:49:34.390992 | orchestrator | changed: [testbed-node-2] 2025-09-27 00:49:34.390999 | orchestrator | 2025-09-27 00:49:34.391024 | orchestrator | TASK [include_role : blazar] *************************************************** 2025-09-27 00:49:34.391033 | orchestrator | Saturday 27 September 2025 00:45:33 +0000 (0:00:01.993) 0:01:27.126 **** 2025-09-27 00:49:34.391041 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:49:34.391049 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:49:34.391057 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:49:34.391064 | orchestrator | 2025-09-27 00:49:34.391072 | orchestrator | TASK [include_role : ceph-rgw] ************************************************* 2025-09-27 00:49:34.391080 | orchestrator | Saturday 27 September 2025 00:45:34 +0000 (0:00:00.299) 0:01:27.425 **** 2025-09-27 00:49:34.391088 | orchestrator | included: ceph-rgw for testbed-node-0, testbed-node-1, testbed-node-2 2025-09-27 00:49:34.391096 | orchestrator | 2025-09-27 00:49:34.391107 | orchestrator | TASK [haproxy-config : Copying over ceph-rgw haproxy config] ******************* 2025-09-27 00:49:34.391115 | orchestrator | Saturday 27 September 2025 00:45:34 +0000 (0:00:00.789) 0:01:28.214 **** 2025-09-27 00:49:34.391124 | orchestrator | changed: [testbed-node-0] => (item={'key': 'ceph-rgw', 'value': {'group': 'all', 'enabled': True, 'haproxy': {'radosgw': {'enabled': True, 'mode': 'http', 'external': False, 'port': '6780', 'custom_member_list': ['server testbed-node-3 192.168.16.13:8081 check inter 2000 rise 2 fall 5', 'server testbed-node-4 192.168.16.14:8081 check inter 2000 rise 2 fall 5', 'server testbed-node-5 192.168.16.15:8081 check inter 2000 rise 2 fall 5']}, 'radosgw_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '6780', 'custom_member_list': ['server testbed-node-3 192.168.16.13:8081 check inter 2000 rise 2 fall 5', 'server testbed-node-4 192.168.16.14:8081 check inter 2000 rise 2 fall 5', 'server testbed-node-5 192.168.16.15:8081 check inter 2000 rise 2 fall 5']}}}}) 2025-09-27 00:49:34.391138 | orchestrator | changed: [testbed-node-1] => (item={'key': 'ceph-rgw', 'value': {'group': 'all', 'enabled': True, 'haproxy': {'radosgw': {'enabled': True, 'mode': 'http', 'external': False, 'port': '6780', 'custom_member_list': ['server testbed-node-3 192.168.16.13:8081 check inter 2000 rise 2 fall 5', 'server testbed-node-4 192.168.16.14:8081 check inter 2000 rise 2 fall 5', 'server testbed-node-5 192.168.16.15:8081 check inter 2000 rise 2 fall 5']}, 'radosgw_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '6780', 'custom_member_list': ['server testbed-node-3 192.168.16.13:8081 check inter 2000 rise 2 fall 5', 'server testbed-node-4 192.168.16.14:8081 check inter 2000 rise 2 fall 5', 'server testbed-node-5 192.168.16.15:8081 check 
inter 2000 rise 2 fall 5']}}}}) 2025-09-27 00:49:34.391146 | orchestrator | changed: [testbed-node-2] => (item={'key': 'ceph-rgw', 'value': {'group': 'all', 'enabled': True, 'haproxy': {'radosgw': {'enabled': True, 'mode': 'http', 'external': False, 'port': '6780', 'custom_member_list': ['server testbed-node-3 192.168.16.13:8081 check inter 2000 rise 2 fall 5', 'server testbed-node-4 192.168.16.14:8081 check inter 2000 rise 2 fall 5', 'server testbed-node-5 192.168.16.15:8081 check inter 2000 rise 2 fall 5']}, 'radosgw_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '6780', 'custom_member_list': ['server testbed-node-3 192.168.16.13:8081 check inter 2000 rise 2 fall 5', 'server testbed-node-4 192.168.16.14:8081 check inter 2000 rise 2 fall 5', 'server testbed-node-5 192.168.16.15:8081 check inter 2000 rise 2 fall 5']}}}}) 2025-09-27 00:49:34.391155 | orchestrator | 2025-09-27 00:49:34.391162 | orchestrator | TASK [haproxy-config : Add configuration for ceph-rgw when using single external frontend] *** 2025-09-27 00:49:34.391170 | orchestrator | Saturday 27 September 2025 00:45:37 +0000 (0:00:02.454) 0:01:30.669 **** 2025-09-27 00:49:34.391193 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'ceph-rgw', 'value': {'group': 'all', 'enabled': True, 'haproxy': {'radosgw': {'enabled': True, 'mode': 'http', 'external': False, 'port': '6780', 'custom_member_list': ['server testbed-node-3 192.168.16.13:8081 check inter 2000 rise 2 fall 5', 'server testbed-node-4 192.168.16.14:8081 check inter 2000 rise 2 fall 5', 'server testbed-node-5 192.168.16.15:8081 check inter 2000 rise 2 fall 5']}, 'radosgw_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '6780', 'custom_member_list': ['server testbed-node-3 192.168.16.13:8081 check inter 2000 rise 2 fall 5', 'server testbed-node-4 192.168.16.14:8081 check inter 2000 rise 2 fall 5', 'server testbed-node-5 192.168.16.15:8081 check inter 2000 rise 2 fall 5']}}}})  2025-09-27 00:49:34.391223 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:49:34.391235 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'ceph-rgw', 'value': {'group': 'all', 'enabled': True, 'haproxy': {'radosgw': {'enabled': True, 'mode': 'http', 'external': False, 'port': '6780', 'custom_member_list': ['server testbed-node-3 192.168.16.13:8081 check inter 2000 rise 2 fall 5', 'server testbed-node-4 192.168.16.14:8081 check inter 2000 rise 2 fall 5', 'server testbed-node-5 192.168.16.15:8081 check inter 2000 rise 2 fall 5']}, 'radosgw_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '6780', 'custom_member_list': ['server testbed-node-3 192.168.16.13:8081 check inter 2000 rise 2 fall 5', 'server testbed-node-4 192.168.16.14:8081 check inter 2000 rise 2 fall 5', 'server testbed-node-5 192.168.16.15:8081 check inter 2000 rise 2 fall 5']}}}})  2025-09-27 00:49:34.391249 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:49:34.391257 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'ceph-rgw', 'value': {'group': 'all', 'enabled': True, 'haproxy': {'radosgw': {'enabled': True, 'mode': 'http', 'external': False, 'port': '6780', 'custom_member_list': ['server testbed-node-3 192.168.16.13:8081 check inter 2000 rise 2 fall 5', 'server testbed-node-4 192.168.16.14:8081 check inter 2000 rise 2 fall 5', 'server testbed-node-5 192.168.16.15:8081 check inter 2000 rise 2 
fall 5']}, 'radosgw_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '6780', 'custom_member_list': ['server testbed-node-3 192.168.16.13:8081 check inter 2000 rise 2 fall 5', 'server testbed-node-4 192.168.16.14:8081 check inter 2000 rise 2 fall 5', 'server testbed-node-5 192.168.16.15:8081 check inter 2000 rise 2 fall 5']}}}})  2025-09-27 00:49:34.391265 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:49:34.391273 | orchestrator | 2025-09-27 00:49:34.391281 | orchestrator | TASK [haproxy-config : Configuring firewall for ceph-rgw] ********************** 2025-09-27 00:49:34.391289 | orchestrator | Saturday 27 September 2025 00:45:38 +0000 (0:00:01.404) 0:01:32.074 **** 2025-09-27 00:49:34.391297 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'radosgw', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '6780', 'custom_member_list': ['server testbed-node-3 192.168.16.13:8081 check inter 2000 rise 2 fall 5', 'server testbed-node-4 192.168.16.14:8081 check inter 2000 rise 2 fall 5', 'server testbed-node-5 192.168.16.15:8081 check inter 2000 rise 2 fall 5']}})  2025-09-27 00:49:34.391306 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'radosgw_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '6780', 'custom_member_list': ['server testbed-node-3 192.168.16.13:8081 check inter 2000 rise 2 fall 5', 'server testbed-node-4 192.168.16.14:8081 check inter 2000 rise 2 fall 5', 'server testbed-node-5 192.168.16.15:8081 check inter 2000 rise 2 fall 5']}})  2025-09-27 00:49:34.391314 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:49:34.391322 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'radosgw', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '6780', 'custom_member_list': ['server testbed-node-3 192.168.16.13:8081 check inter 2000 rise 2 fall 5', 'server testbed-node-4 192.168.16.14:8081 check inter 2000 rise 2 fall 5', 'server testbed-node-5 192.168.16.15:8081 check inter 2000 rise 2 fall 5']}})  2025-09-27 00:49:34.391331 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'radosgw_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '6780', 'custom_member_list': ['server testbed-node-3 192.168.16.13:8081 check inter 2000 rise 2 fall 5', 'server testbed-node-4 192.168.16.14:8081 check inter 2000 rise 2 fall 5', 'server testbed-node-5 192.168.16.15:8081 check inter 2000 rise 2 fall 5']}})  2025-09-27 00:49:34.391339 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:49:34.391362 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'radosgw', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '6780', 'custom_member_list': ['server testbed-node-3 192.168.16.13:8081 check inter 2000 rise 2 fall 5', 'server testbed-node-4 192.168.16.14:8081 check inter 2000 rise 2 fall 5', 'server testbed-node-5 192.168.16.15:8081 check inter 2000 rise 2 fall 5']}})  2025-09-27 00:49:34.391371 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'radosgw_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '6780', 'custom_member_list': ['server testbed-node-3 192.168.16.13:8081 check inter 2000 rise 2 fall 5', 'server testbed-node-4 192.168.16.14:8081 check inter 2000 rise 2 fall 5', 'server 
testbed-node-5 192.168.16.15:8081 check inter 2000 rise 2 fall 5']}})  2025-09-27 00:49:34.391387 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:49:34.391396 | orchestrator | 2025-09-27 00:49:34.391403 | orchestrator | TASK [proxysql-config : Copying over ceph-rgw ProxySQL users config] *********** 2025-09-27 00:49:34.391411 | orchestrator | Saturday 27 September 2025 00:45:40 +0000 (0:00:01.623) 0:01:33.698 **** 2025-09-27 00:49:34.391419 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:49:34.391427 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:49:34.391434 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:49:34.391442 | orchestrator | 2025-09-27 00:49:34.391450 | orchestrator | TASK [proxysql-config : Copying over ceph-rgw ProxySQL rules config] *********** 2025-09-27 00:49:34.391458 | orchestrator | Saturday 27 September 2025 00:45:41 +0000 (0:00:00.677) 0:01:34.376 **** 2025-09-27 00:49:34.391465 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:49:34.391473 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:49:34.391481 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:49:34.391488 | orchestrator | 2025-09-27 00:49:34.391496 | orchestrator | TASK [include_role : cinder] *************************************************** 2025-09-27 00:49:34.391504 | orchestrator | Saturday 27 September 2025 00:45:42 +0000 (0:00:01.151) 0:01:35.527 **** 2025-09-27 00:49:34.391512 | orchestrator | included: cinder for testbed-node-0, testbed-node-1, testbed-node-2 2025-09-27 00:49:34.391519 | orchestrator | 2025-09-27 00:49:34.391527 | orchestrator | TASK [haproxy-config : Copying over cinder haproxy config] ********************* 2025-09-27 00:49:34.391535 | orchestrator | Saturday 27 September 2025 00:45:42 +0000 (0:00:00.719) 0:01:36.247 **** 2025-09-27 00:49:34.391543 | orchestrator | changed: [testbed-node-1] => (item={'key': 'cinder-api', 'value': {'container_name': 'cinder_api', 'group': 'cinder-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/cinder-api:2024.2', 'volumes': ['/etc/kolla/cinder-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:8776'], 'timeout': '30'}, 'haproxy': {'cinder_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '8776', 'listen_port': '8776', 'tls_backend': 'no'}, 'cinder_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8776', 'listen_port': '8776', 'tls_backend': 'no'}}}}) 2025-09-27 00:49:34.391552 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'cinder-scheduler', 'value': {'container_name': 'cinder_scheduler', 'group': 'cinder-scheduler', 'enabled': True, 'image': 'registry.osism.tech/kolla/cinder-scheduler:2024.2', 'volumes': ['/etc/kolla/cinder-scheduler/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port cinder-scheduler 5672'], 'timeout': '30'}}})  2025-09-27 00:49:34.391560 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'cinder-volume', 'value': {'container_name': 'cinder_volume', 'group': 'cinder-volume', 'enabled': 
True, 'image': 'registry.osism.tech/kolla/cinder-volume:2024.2', 'privileged': True, 'ipc_mode': 'host', 'tmpfs': [''], 'volumes': ['/etc/kolla/cinder-volume/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/lib/modules:/lib/modules:ro', '/run:/run:shared', 'cinder:/var/lib/cinder', 'iscsi_info:/etc/iscsi', '', 'kolla_logs:/var/log/kolla/', '', '/opt/cinder-driver-dm-clone:/var/lib/kolla/venv/lib/python3/site-packages/cinder-driver-dm-clone'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port cinder-volume 5672'], 'timeout': '30'}}})  2025-09-27 00:49:34.391588 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'cinder-backup', 'value': {'container_name': 'cinder_backup', 'group': 'cinder-backup', 'enabled': True, 'image': 'registry.osism.tech/kolla/cinder-backup:2024.2', 'privileged': True, 'volumes': ['/etc/kolla/cinder-backup/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/lib/modules:/lib/modules:ro', '/run:/run:shared', 'cinder:/var/lib/cinder', 'iscsi_info:/etc/iscsi', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port cinder-backup 5672'], 'timeout': '30'}}})  2025-09-27 00:49:34.391610 | orchestrator | changed: [testbed-node-2] => (item={'key': 'cinder-api', 'value': {'container_name': 'cinder_api', 'group': 'cinder-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/cinder-api:2024.2', 'volumes': ['/etc/kolla/cinder-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:8776'], 'timeout': '30'}, 'haproxy': {'cinder_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '8776', 'listen_port': '8776', 'tls_backend': 'no'}, 'cinder_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8776', 'listen_port': '8776', 'tls_backend': 'no'}}}}) 2025-09-27 00:49:34.391619 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'cinder-scheduler', 'value': {'container_name': 'cinder_scheduler', 'group': 'cinder-scheduler', 'enabled': True, 'image': 'registry.osism.tech/kolla/cinder-scheduler:2024.2', 'volumes': ['/etc/kolla/cinder-scheduler/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port cinder-scheduler 5672'], 'timeout': '30'}}})  2025-09-27 00:49:34.391627 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'cinder-volume', 'value': {'container_name': 'cinder_volume', 'group': 'cinder-volume', 'enabled': True, 'image': 'registry.osism.tech/kolla/cinder-volume:2024.2', 'privileged': True, 'ipc_mode': 'host', 'tmpfs': [''], 'volumes': ['/etc/kolla/cinder-volume/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/lib/modules:/lib/modules:ro', '/run:/run:shared', 'cinder:/var/lib/cinder', 'iscsi_info:/etc/iscsi', '', 
'kolla_logs:/var/log/kolla/', '', '/opt/cinder-driver-dm-clone:/var/lib/kolla/venv/lib/python3/site-packages/cinder-driver-dm-clone'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port cinder-volume 5672'], 'timeout': '30'}}})  2025-09-27 00:49:34.391636 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'cinder-backup', 'value': {'container_name': 'cinder_backup', 'group': 'cinder-backup', 'enabled': True, 'image': 'registry.osism.tech/kolla/cinder-backup:2024.2', 'privileged': True, 'volumes': ['/etc/kolla/cinder-backup/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/lib/modules:/lib/modules:ro', '/run:/run:shared', 'cinder:/var/lib/cinder', 'iscsi_info:/etc/iscsi', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port cinder-backup 5672'], 'timeout': '30'}}})  2025-09-27 00:49:34.391659 | orchestrator | changed: [testbed-node-0] => (item={'key': 'cinder-api', 'value': {'container_name': 'cinder_api', 'group': 'cinder-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/cinder-api:2024.2', 'volumes': ['/etc/kolla/cinder-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:8776'], 'timeout': '30'}, 'haproxy': {'cinder_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '8776', 'listen_port': '8776', 'tls_backend': 'no'}, 'cinder_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8776', 'listen_port': '8776', 'tls_backend': 'no'}}}}) 2025-09-27 00:49:34.391677 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'cinder-scheduler', 'value': {'container_name': 'cinder_scheduler', 'group': 'cinder-scheduler', 'enabled': True, 'image': 'registry.osism.tech/kolla/cinder-scheduler:2024.2', 'volumes': ['/etc/kolla/cinder-scheduler/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port cinder-scheduler 5672'], 'timeout': '30'}}})  2025-09-27 00:49:34.391686 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'cinder-volume', 'value': {'container_name': 'cinder_volume', 'group': 'cinder-volume', 'enabled': True, 'image': 'registry.osism.tech/kolla/cinder-volume:2024.2', 'privileged': True, 'ipc_mode': 'host', 'tmpfs': [''], 'volumes': ['/etc/kolla/cinder-volume/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/lib/modules:/lib/modules:ro', '/run:/run:shared', 'cinder:/var/lib/cinder', 'iscsi_info:/etc/iscsi', '', 'kolla_logs:/var/log/kolla/', '', '/opt/cinder-driver-dm-clone:/var/lib/kolla/venv/lib/python3/site-packages/cinder-driver-dm-clone'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port cinder-volume 5672'], 'timeout': '30'}}})  2025-09-27 00:49:34.391694 | orchestrator | skipping: [testbed-node-0] => (item={'key': 
'cinder-backup', 'value': {'container_name': 'cinder_backup', 'group': 'cinder-backup', 'enabled': True, 'image': 'registry.osism.tech/kolla/cinder-backup:2024.2', 'privileged': True, 'volumes': ['/etc/kolla/cinder-backup/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/lib/modules:/lib/modules:ro', '/run:/run:shared', 'cinder:/var/lib/cinder', 'iscsi_info:/etc/iscsi', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port cinder-backup 5672'], 'timeout': '30'}}})  2025-09-27 00:49:34.391702 | orchestrator | 2025-09-27 00:49:34.391710 | orchestrator | TASK [haproxy-config : Add configuration for cinder when using single external frontend] *** 2025-09-27 00:49:34.391718 | orchestrator | Saturday 27 September 2025 00:45:47 +0000 (0:00:04.366) 0:01:40.613 **** 2025-09-27 00:49:34.391726 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'cinder-api', 'value': {'container_name': 'cinder_api', 'group': 'cinder-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/cinder-api:2024.2', 'volumes': ['/etc/kolla/cinder-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:8776'], 'timeout': '30'}, 'haproxy': {'cinder_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '8776', 'listen_port': '8776', 'tls_backend': 'no'}, 'cinder_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8776', 'listen_port': '8776', 'tls_backend': 'no'}}}})  2025-09-27 00:49:34.391752 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'cinder-api', 'value': {'container_name': 'cinder_api', 'group': 'cinder-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/cinder-api:2024.2', 'volumes': ['/etc/kolla/cinder-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:8776'], 'timeout': '30'}, 'haproxy': {'cinder_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '8776', 'listen_port': '8776', 'tls_backend': 'no'}, 'cinder_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8776', 'listen_port': '8776', 'tls_backend': 'no'}}}})  2025-09-27 00:49:34.391765 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'cinder-scheduler', 'value': {'container_name': 'cinder_scheduler', 'group': 'cinder-scheduler', 'enabled': True, 'image': 'registry.osism.tech/kolla/cinder-scheduler:2024.2', 'volumes': ['/etc/kolla/cinder-scheduler/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port cinder-scheduler 5672'], 'timeout': '30'}}})  2025-09-27 00:49:34.391774 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'cinder-scheduler', 'value': {'container_name': 
'cinder_scheduler', 'group': 'cinder-scheduler', 'enabled': True, 'image': 'registry.osism.tech/kolla/cinder-scheduler:2024.2', 'volumes': ['/etc/kolla/cinder-scheduler/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port cinder-scheduler 5672'], 'timeout': '30'}}})  2025-09-27 00:49:34.391782 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'cinder-volume', 'value': {'container_name': 'cinder_volume', 'group': 'cinder-volume', 'enabled': True, 'image': 'registry.osism.tech/kolla/cinder-volume:2024.2', 'privileged': True, 'ipc_mode': 'host', 'tmpfs': [''], 'volumes': ['/etc/kolla/cinder-volume/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/lib/modules:/lib/modules:ro', '/run:/run:shared', 'cinder:/var/lib/cinder', 'iscsi_info:/etc/iscsi', '', 'kolla_logs:/var/log/kolla/', '', '/opt/cinder-driver-dm-clone:/var/lib/kolla/venv/lib/python3/site-packages/cinder-driver-dm-clone'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port cinder-volume 5672'], 'timeout': '30'}}})  2025-09-27 00:49:34.391791 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'cinder-volume', 'value': {'container_name': 'cinder_volume', 'group': 'cinder-volume', 'enabled': True, 'image': 'registry.osism.tech/kolla/cinder-volume:2024.2', 'privileged': True, 'ipc_mode': 'host', 'tmpfs': [''], 'volumes': ['/etc/kolla/cinder-volume/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/lib/modules:/lib/modules:ro', '/run:/run:shared', 'cinder:/var/lib/cinder', 'iscsi_info:/etc/iscsi', '', 'kolla_logs:/var/log/kolla/', '', '/opt/cinder-driver-dm-clone:/var/lib/kolla/venv/lib/python3/site-packages/cinder-driver-dm-clone'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port cinder-volume 5672'], 'timeout': '30'}}})  2025-09-27 00:49:34.391799 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'cinder-backup', 'value': {'container_name': 'cinder_backup', 'group': 'cinder-backup', 'enabled': True, 'image': 'registry.osism.tech/kolla/cinder-backup:2024.2', 'privileged': True, 'volumes': ['/etc/kolla/cinder-backup/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/lib/modules:/lib/modules:ro', '/run:/run:shared', 'cinder:/var/lib/cinder', 'iscsi_info:/etc/iscsi', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port cinder-backup 5672'], 'timeout': '30'}}})  2025-09-27 00:49:34.391815 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:49:34.391836 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'cinder-backup', 'value': {'container_name': 'cinder_backup', 'group': 'cinder-backup', 'enabled': True, 'image': 'registry.osism.tech/kolla/cinder-backup:2024.2', 'privileged': True, 'volumes': ['/etc/kolla/cinder-backup/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/lib/modules:/lib/modules:ro', '/run:/run:shared', 
'cinder:/var/lib/cinder', 'iscsi_info:/etc/iscsi', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port cinder-backup 5672'], 'timeout': '30'}}})  2025-09-27 00:49:34.391844 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:49:34.391856 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'cinder-api', 'value': {'container_name': 'cinder_api', 'group': 'cinder-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/cinder-api:2024.2', 'volumes': ['/etc/kolla/cinder-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:8776'], 'timeout': '30'}, 'haproxy': {'cinder_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '8776', 'listen_port': '8776', 'tls_backend': 'no'}, 'cinder_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8776', 'listen_port': '8776', 'tls_backend': 'no'}}}})  2025-09-27 00:49:34.391865 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'cinder-scheduler', 'value': {'container_name': 'cinder_scheduler', 'group': 'cinder-scheduler', 'enabled': True, 'image': 'registry.osism.tech/kolla/cinder-scheduler:2024.2', 'volumes': ['/etc/kolla/cinder-scheduler/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port cinder-scheduler 5672'], 'timeout': '30'}}})  2025-09-27 00:49:34.391873 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'cinder-volume', 'value': {'container_name': 'cinder_volume', 'group': 'cinder-volume', 'enabled': True, 'image': 'registry.osism.tech/kolla/cinder-volume:2024.2', 'privileged': True, 'ipc_mode': 'host', 'tmpfs': [''], 'volumes': ['/etc/kolla/cinder-volume/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/lib/modules:/lib/modules:ro', '/run:/run:shared', 'cinder:/var/lib/cinder', 'iscsi_info:/etc/iscsi', '', 'kolla_logs:/var/log/kolla/', '', '/opt/cinder-driver-dm-clone:/var/lib/kolla/venv/lib/python3/site-packages/cinder-driver-dm-clone'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port cinder-volume 5672'], 'timeout': '30'}}})  2025-09-27 00:49:34.391882 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'cinder-backup', 'value': {'container_name': 'cinder_backup', 'group': 'cinder-backup', 'enabled': True, 'image': 'registry.osism.tech/kolla/cinder-backup:2024.2', 'privileged': True, 'volumes': ['/etc/kolla/cinder-backup/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/lib/modules:/lib/modules:ro', '/run:/run:shared', 'cinder:/var/lib/cinder', 'iscsi_info:/etc/iscsi', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port cinder-backup 5672'], 'timeout': '30'}}})  2025-09-27 00:49:34.391895 | orchestrator | skipping: 
[testbed-node-2] 2025-09-27 00:49:34.391903 | orchestrator | 2025-09-27 00:49:34.391911 | orchestrator | TASK [haproxy-config : Configuring firewall for cinder] ************************ 2025-09-27 00:49:34.391919 | orchestrator | Saturday 27 September 2025 00:45:48 +0000 (0:00:00.772) 0:01:41.386 **** 2025-09-27 00:49:34.391927 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'cinder_api', 'value': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '8776', 'listen_port': '8776', 'tls_backend': 'no'}})  2025-09-27 00:49:34.391947 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'cinder_api_external', 'value': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8776', 'listen_port': '8776', 'tls_backend': 'no'}})  2025-09-27 00:49:34.391956 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:49:34.391964 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'cinder_api', 'value': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '8776', 'listen_port': '8776', 'tls_backend': 'no'}})  2025-09-27 00:49:34.391976 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'cinder_api_external', 'value': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8776', 'listen_port': '8776', 'tls_backend': 'no'}})  2025-09-27 00:49:34.391984 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:49:34.391992 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'cinder_api', 'value': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '8776', 'listen_port': '8776', 'tls_backend': 'no'}})  2025-09-27 00:49:34.392000 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'cinder_api_external', 'value': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8776', 'listen_port': '8776', 'tls_backend': 'no'}})  2025-09-27 00:49:34.392008 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:49:34.392016 | orchestrator | 2025-09-27 00:49:34.392023 | orchestrator | TASK [proxysql-config : Copying over cinder ProxySQL users config] ************* 2025-09-27 00:49:34.392031 | orchestrator | Saturday 27 September 2025 00:45:48 +0000 (0:00:00.755) 0:01:42.141 **** 2025-09-27 00:49:34.392039 | orchestrator | changed: [testbed-node-0] 2025-09-27 00:49:34.392047 | orchestrator | changed: [testbed-node-1] 2025-09-27 00:49:34.392054 | orchestrator | changed: [testbed-node-2] 2025-09-27 00:49:34.392062 | orchestrator | 2025-09-27 00:49:34.392070 | orchestrator | TASK [proxysql-config : Copying over cinder ProxySQL rules config] ************* 2025-09-27 00:49:34.392078 | orchestrator | Saturday 27 September 2025 00:45:49 +0000 (0:00:01.161) 0:01:43.303 **** 2025-09-27 00:49:34.392085 | orchestrator | changed: [testbed-node-0] 2025-09-27 00:49:34.392093 | orchestrator | changed: [testbed-node-1] 2025-09-27 00:49:34.392101 | orchestrator | changed: [testbed-node-2] 2025-09-27 00:49:34.392109 | orchestrator | 2025-09-27 00:49:34.392116 | orchestrator | TASK [include_role : cloudkitty] *********************************************** 2025-09-27 00:49:34.392124 | orchestrator | Saturday 27 September 2025 00:45:51 +0000 (0:00:01.842) 0:01:45.145 **** 2025-09-27 00:49:34.392132 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:49:34.392139 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:49:34.392147 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:49:34.392155 | 
orchestrator | 2025-09-27 00:49:34.392163 | orchestrator | TASK [include_role : cyborg] *************************************************** 2025-09-27 00:49:34.392170 | orchestrator | Saturday 27 September 2025 00:45:52 +0000 (0:00:00.396) 0:01:45.542 **** 2025-09-27 00:49:34.392183 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:49:34.392191 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:49:34.392209 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:49:34.392218 | orchestrator | 2025-09-27 00:49:34.392225 | orchestrator | TASK [include_role : designate] ************************************************ 2025-09-27 00:49:34.392233 | orchestrator | Saturday 27 September 2025 00:45:52 +0000 (0:00:00.291) 0:01:45.833 **** 2025-09-27 00:49:34.392241 | orchestrator | included: designate for testbed-node-0, testbed-node-1, testbed-node-2 2025-09-27 00:49:34.392249 | orchestrator | 2025-09-27 00:49:34.392256 | orchestrator | TASK [haproxy-config : Copying over designate haproxy config] ****************** 2025-09-27 00:49:34.392264 | orchestrator | Saturday 27 September 2025 00:45:53 +0000 (0:00:00.805) 0:01:46.639 **** 2025-09-27 00:49:34.392272 | orchestrator | changed: [testbed-node-1] => (item={'key': 'designate-api', 'value': {'container_name': 'designate_api', 'group': 'designate-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/designate-api:2024.2', 'volumes': ['/etc/kolla/designate-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:9001'], 'timeout': '30'}, 'haproxy': {'designate_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9001', 'listen_port': '9001'}, 'designate_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9001', 'listen_port': '9001'}}}}) 2025-09-27 00:49:34.392293 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'designate-backend-bind9', 'value': {'container_name': 'designate_backend_bind9', 'group': 'designate-backend-bind9', 'enabled': True, 'image': 'registry.osism.tech/kolla/designate-backend-bind9:2024.2', 'volumes': ['/etc/kolla/designate-backend-bind9/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', 'designate_backend_bind9:/var/lib/named/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen named 53'], 'timeout': '30'}}})  2025-09-27 00:49:34.392306 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'designate-central', 'value': {'container_name': 'designate_central', 'group': 'designate-central', 'enabled': True, 'image': 'registry.osism.tech/kolla/designate-central:2024.2', 'volumes': ['/etc/kolla/designate-central/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port designate-central 5672'], 'timeout': '30'}}})  2025-09-27 00:49:34.392315 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'designate-mdns', 'value': {'container_name': 'designate_mdns', 'group': 'designate-mdns', 
'enabled': True, 'image': 'registry.osism.tech/kolla/designate-mdns:2024.2', 'volumes': ['/etc/kolla/designate-mdns/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port designate-mdns 5672'], 'timeout': '30'}}})  2025-09-27 00:49:34.392323 | orchestrator | changed: [testbed-node-2] => (item={'key': 'designate-api', 'value': {'container_name': 'designate_api', 'group': 'designate-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/designate-api:2024.2', 'volumes': ['/etc/kolla/designate-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:9001'], 'timeout': '30'}, 'haproxy': {'designate_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9001', 'listen_port': '9001'}, 'designate_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9001', 'listen_port': '9001'}}}}) 2025-09-27 00:49:34.392336 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'designate-producer', 'value': {'container_name': 'designate_producer', 'group': 'designate-producer', 'enabled': True, 'image': 'registry.osism.tech/kolla/designate-producer:2024.2', 'volumes': ['/etc/kolla/designate-producer/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port designate-producer 5672'], 'timeout': '30'}}})  2025-09-27 00:49:34.392345 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'designate-worker', 'value': {'container_name': 'designate_worker', 'group': 'designate-worker', 'enabled': True, 'image': 'registry.osism.tech/kolla/designate-worker:2024.2', 'volumes': ['/etc/kolla/designate-worker/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port designate-worker 5672'], 'timeout': '30'}}})  2025-09-27 00:49:34.392353 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'designate-sink', 'value': {'container_name': 'designate_sink', 'group': 'designate-sink', 'enabled': False, 'image': 'registry.osism.tech/kolla/designate-sink:2024.2', 'volumes': ['/etc/kolla/designate-sink/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port designate-sink 5672'], 'timeout': '30'}}})  2025-09-27 00:49:34.392373 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'designate-backend-bind9', 'value': {'container_name': 'designate_backend_bind9', 'group': 'designate-backend-bind9', 'enabled': True, 'image': 'registry.osism.tech/kolla/designate-backend-bind9:2024.2', 'volumes': 
['/etc/kolla/designate-backend-bind9/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', 'designate_backend_bind9:/var/lib/named/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen named 53'], 'timeout': '30'}}})  2025-09-27 00:49:34.392386 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'designate-central', 'value': {'container_name': 'designate_central', 'group': 'designate-central', 'enabled': True, 'image': 'registry.osism.tech/kolla/designate-central:2024.2', 'volumes': ['/etc/kolla/designate-central/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port designate-central 5672'], 'timeout': '30'}}})  2025-09-27 00:49:34.392395 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'designate-mdns', 'value': {'container_name': 'designate_mdns', 'group': 'designate-mdns', 'enabled': True, 'image': 'registry.osism.tech/kolla/designate-mdns:2024.2', 'volumes': ['/etc/kolla/designate-mdns/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port designate-mdns 5672'], 'timeout': '30'}}})  2025-09-27 00:49:34.392408 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'designate-producer', 'value': {'container_name': 'designate_producer', 'group': 'designate-producer', 'enabled': True, 'image': 'registry.osism.tech/kolla/designate-producer:2024.2', 'volumes': ['/etc/kolla/designate-producer/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port designate-producer 5672'], 'timeout': '30'}}})  2025-09-27 00:49:34.392416 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'designate-worker', 'value': {'container_name': 'designate_worker', 'group': 'designate-worker', 'enabled': True, 'image': 'registry.osism.tech/kolla/designate-worker:2024.2', 'volumes': ['/etc/kolla/designate-worker/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port designate-worker 5672'], 'timeout': '30'}}})  2025-09-27 00:49:34.392424 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'designate-sink', 'value': {'container_name': 'designate_sink', 'group': 'designate-sink', 'enabled': False, 'image': 'registry.osism.tech/kolla/designate-sink:2024.2', 'volumes': ['/etc/kolla/designate-sink/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port designate-sink 5672'], 'timeout': '30'}}})  2025-09-27 00:49:34.392444 | orchestrator | changed: 
[testbed-node-0] => (item={'key': 'designate-api', 'value': {'container_name': 'designate_api', 'group': 'designate-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/designate-api:2024.2', 'volumes': ['/etc/kolla/designate-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:9001'], 'timeout': '30'}, 'haproxy': {'designate_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9001', 'listen_port': '9001'}, 'designate_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9001', 'listen_port': '9001'}}}}) 2025-09-27 00:49:34.392457 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'designate-backend-bind9', 'value': {'container_name': 'designate_backend_bind9', 'group': 'designate-backend-bind9', 'enabled': True, 'image': 'registry.osism.tech/kolla/designate-backend-bind9:2024.2', 'volumes': ['/etc/kolla/designate-backend-bind9/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', 'designate_backend_bind9:/var/lib/named/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen named 53'], 'timeout': '30'}}})  2025-09-27 00:49:34.392466 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'designate-central', 'value': {'container_name': 'designate_central', 'group': 'designate-central', 'enabled': True, 'image': 'registry.osism.tech/kolla/designate-central:2024.2', 'volumes': ['/etc/kolla/designate-central/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port designate-central 5672'], 'timeout': '30'}}})  2025-09-27 00:49:34.392479 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'designate-mdns', 'value': {'container_name': 'designate_mdns', 'group': 'designate-mdns', 'enabled': True, 'image': 'registry.osism.tech/kolla/designate-mdns:2024.2', 'volumes': ['/etc/kolla/designate-mdns/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port designate-mdns 5672'], 'timeout': '30'}}})  2025-09-27 00:49:34.392487 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'designate-producer', 'value': {'container_name': 'designate_producer', 'group': 'designate-producer', 'enabled': True, 'image': 'registry.osism.tech/kolla/designate-producer:2024.2', 'volumes': ['/etc/kolla/designate-producer/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port designate-producer 5672'], 'timeout': '30'}}})  2025-09-27 00:49:34.392495 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'designate-worker', 'value': {'container_name': 
'designate_worker', 'group': 'designate-worker', 'enabled': True, 'image': 'registry.osism.tech/kolla/designate-worker:2024.2', 'volumes': ['/etc/kolla/designate-worker/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port designate-worker 5672'], 'timeout': '30'}}})  2025-09-27 00:49:34.392503 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'designate-sink', 'value': {'container_name': 'designate_sink', 'group': 'designate-sink', 'enabled': False, 'image': 'registry.osism.tech/kolla/designate-sink:2024.2', 'volumes': ['/etc/kolla/designate-sink/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port designate-sink 5672'], 'timeout': '30'}}})  2025-09-27 00:49:34.392511 | orchestrator | 2025-09-27 00:49:34.392519 | orchestrator | TASK [haproxy-config : Add configuration for designate when using single external frontend] *** 2025-09-27 00:49:34.392527 | orchestrator | Saturday 27 September 2025 00:45:58 +0000 (0:00:04.864) 0:01:51.503 **** 2025-09-27 00:49:34.392552 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'designate-api', 'value': {'container_name': 'designate_api', 'group': 'designate-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/designate-api:2024.2', 'volumes': ['/etc/kolla/designate-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:9001'], 'timeout': '30'}, 'haproxy': {'designate_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9001', 'listen_port': '9001'}, 'designate_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9001', 'listen_port': '9001'}}}})  2025-09-27 00:49:34.392561 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'designate-backend-bind9', 'value': {'container_name': 'designate_backend_bind9', 'group': 'designate-backend-bind9', 'enabled': True, 'image': 'registry.osism.tech/kolla/designate-backend-bind9:2024.2', 'volumes': ['/etc/kolla/designate-backend-bind9/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', 'designate_backend_bind9:/var/lib/named/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen named 53'], 'timeout': '30'}}})  2025-09-27 00:49:34.392574 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'designate-central', 'value': {'container_name': 'designate_central', 'group': 'designate-central', 'enabled': True, 'image': 'registry.osism.tech/kolla/designate-central:2024.2', 'volumes': ['/etc/kolla/designate-central/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 
'healthcheck_port designate-central 5672'], 'timeout': '30'}}})  2025-09-27 00:49:34.392582 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'designate-mdns', 'value': {'container_name': 'designate_mdns', 'group': 'designate-mdns', 'enabled': True, 'image': 'registry.osism.tech/kolla/designate-mdns:2024.2', 'volumes': ['/etc/kolla/designate-mdns/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port designate-mdns 5672'], 'timeout': '30'}}})  2025-09-27 00:49:34.392591 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'designate-producer', 'value': {'container_name': 'designate_producer', 'group': 'designate-producer', 'enabled': True, 'image': 'registry.osism.tech/kolla/designate-producer:2024.2', 'volumes': ['/etc/kolla/designate-producer/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port designate-producer 5672'], 'timeout': '30'}}})  2025-09-27 00:49:34.392599 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'designate-worker', 'value': {'container_name': 'designate_worker', 'group': 'designate-worker', 'enabled': True, 'image': 'registry.osism.tech/kolla/designate-worker:2024.2', 'volumes': ['/etc/kolla/designate-worker/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port designate-worker 5672'], 'timeout': '30'}}})  2025-09-27 00:49:34.392619 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'designate-sink', 'value': {'container_name': 'designate_sink', 'group': 'designate-sink', 'enabled': False, 'image': 'registry.osism.tech/kolla/designate-sink:2024.2', 'volumes': ['/etc/kolla/designate-sink/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port designate-sink 5672'], 'timeout': '30'}}})  2025-09-27 00:49:34.392628 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:49:34.392640 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'designate-api', 'value': {'container_name': 'designate_api', 'group': 'designate-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/designate-api:2024.2', 'volumes': ['/etc/kolla/designate-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:9001'], 'timeout': '30'}, 'haproxy': {'designate_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9001', 'listen_port': '9001'}, 'designate_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9001', 'listen_port': '9001'}}}})  2025-09-27 00:49:34.392653 | 
orchestrator | skipping: [testbed-node-1] => (item={'key': 'designate-backend-bind9', 'value': {'container_name': 'designate_backend_bind9', 'group': 'designate-backend-bind9', 'enabled': True, 'image': 'registry.osism.tech/kolla/designate-backend-bind9:2024.2', 'volumes': ['/etc/kolla/designate-backend-bind9/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', 'designate_backend_bind9:/var/lib/named/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen named 53'], 'timeout': '30'}}})  2025-09-27 00:49:34.392661 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'designate-central', 'value': {'container_name': 'designate_central', 'group': 'designate-central', 'enabled': True, 'image': 'registry.osism.tech/kolla/designate-central:2024.2', 'volumes': ['/etc/kolla/designate-central/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port designate-central 5672'], 'timeout': '30'}}})  2025-09-27 00:49:34.392669 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'designate-mdns', 'value': {'container_name': 'designate_mdns', 'group': 'designate-mdns', 'enabled': True, 'image': 'registry.osism.tech/kolla/designate-mdns:2024.2', 'volumes': ['/etc/kolla/designate-mdns/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port designate-mdns 5672'], 'timeout': '30'}}})  2025-09-27 00:49:34.392678 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'designate-api', 'value': {'container_name': 'designate_api', 'group': 'designate-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/designate-api:2024.2', 'volumes': ['/etc/kolla/designate-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:9001'], 'timeout': '30'}, 'haproxy': {'designate_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9001', 'listen_port': '9001'}, 'designate_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9001', 'listen_port': '9001'}}}})  2025-09-27 00:49:34.392697 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'designate-producer', 'value': {'container_name': 'designate_producer', 'group': 'designate-producer', 'enabled': True, 'image': 'registry.osism.tech/kolla/designate-producer:2024.2', 'volumes': ['/etc/kolla/designate-producer/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port designate-producer 5672'], 'timeout': '30'}}})  2025-09-27 00:49:34.392713 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'designate-backend-bind9', 
'value': {'container_name': 'designate_backend_bind9', 'group': 'designate-backend-bind9', 'enabled': True, 'image': 'registry.osism.tech/kolla/designate-backend-bind9:2024.2', 'volumes': ['/etc/kolla/designate-backend-bind9/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', 'designate_backend_bind9:/var/lib/named/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen named 53'], 'timeout': '30'}}})  2025-09-27 00:49:34.392726 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'designate-worker', 'value': {'container_name': 'designate_worker', 'group': 'designate-worker', 'enabled': True, 'image': 'registry.osism.tech/kolla/designate-worker:2024.2', 'volumes': ['/etc/kolla/designate-worker/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port designate-worker 5672'], 'timeout': '30'}}})  2025-09-27 00:49:34.392734 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'designate-central', 'value': {'container_name': 'designate_central', 'group': 'designate-central', 'enabled': True, 'image': 'registry.osism.tech/kolla/designate-central:2024.2', 'volumes': ['/etc/kolla/designate-central/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port designate-central 5672'], 'timeout': '30'}}})  2025-09-27 00:49:34.392742 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'designate-sink', 'value': {'container_name': 'designate_sink', 'group': 'designate-sink', 'enabled': False, 'image': 'registry.osism.tech/kolla/designate-sink:2024.2', 'volumes': ['/etc/kolla/designate-sink/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port designate-sink 5672'], 'timeout': '30'}}})  2025-09-27 00:49:34.392751 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'designate-mdns', 'value': {'container_name': 'designate_mdns', 'group': 'designate-mdns', 'enabled': True, 'image': 'registry.osism.tech/kolla/designate-mdns:2024.2', 'volumes': ['/etc/kolla/designate-mdns/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port designate-mdns 5672'], 'timeout': '30'}}})  2025-09-27 00:49:34.392758 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:49:34.392766 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'designate-producer', 'value': {'container_name': 'designate_producer', 'group': 'designate-producer', 'enabled': True, 'image': 'registry.osism.tech/kolla/designate-producer:2024.2', 'volumes': ['/etc/kolla/designate-producer/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 
'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port designate-producer 5672'], 'timeout': '30'}}})  2025-09-27 00:49:34.392786 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'designate-worker', 'value': {'container_name': 'designate_worker', 'group': 'designate-worker', 'enabled': True, 'image': 'registry.osism.tech/kolla/designate-worker:2024.2', 'volumes': ['/etc/kolla/designate-worker/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port designate-worker 5672'], 'timeout': '30'}}})  2025-09-27 00:49:34.392803 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'designate-sink', 'value': {'container_name': 'designate_sink', 'group': 'designate-sink', 'enabled': False, 'image': 'registry.osism.tech/kolla/designate-sink:2024.2', 'volumes': ['/etc/kolla/designate-sink/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port designate-sink 5672'], 'timeout': '30'}}})  2025-09-27 00:49:34.392812 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:49:34.392820 | orchestrator | 2025-09-27 00:49:34.392827 | orchestrator | TASK [haproxy-config : Configuring firewall for designate] ********************* 2025-09-27 00:49:34.392835 | orchestrator | Saturday 27 September 2025 00:45:58 +0000 (0:00:00.780) 0:01:52.284 **** 2025-09-27 00:49:34.392844 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'designate_api', 'value': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9001', 'listen_port': '9001'}})  2025-09-27 00:49:34.392852 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'designate_api', 'value': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9001', 'listen_port': '9001'}})  2025-09-27 00:49:34.392859 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'designate_api_external', 'value': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9001', 'listen_port': '9001'}})  2025-09-27 00:49:34.392868 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'designate_api_external', 'value': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9001', 'listen_port': '9001'}})  2025-09-27 00:49:34.392876 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:49:34.392884 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:49:34.392892 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'designate_api', 'value': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9001', 'listen_port': '9001'}})  2025-09-27 00:49:34.392899 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'designate_api_external', 'value': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9001', 'listen_port': '9001'}})  2025-09-27 00:49:34.392907 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:49:34.392915 | orchestrator | 2025-09-27 00:49:34.392923 | orchestrator | TASK [proxysql-config : Copying 
over designate ProxySQL users config] ********** 2025-09-27 00:49:34.392931 | orchestrator | Saturday 27 September 2025 00:45:59 +0000 (0:00:00.912) 0:01:53.196 **** 2025-09-27 00:49:34.392938 | orchestrator | changed: [testbed-node-0] 2025-09-27 00:49:34.392946 | orchestrator | changed: [testbed-node-1] 2025-09-27 00:49:34.392954 | orchestrator | changed: [testbed-node-2] 2025-09-27 00:49:34.392962 | orchestrator | 2025-09-27 00:49:34.392969 | orchestrator | TASK [proxysql-config : Copying over designate ProxySQL rules config] ********** 2025-09-27 00:49:34.392977 | orchestrator | Saturday 27 September 2025 00:46:01 +0000 (0:00:01.758) 0:01:54.954 **** 2025-09-27 00:49:34.392985 | orchestrator | changed: [testbed-node-0] 2025-09-27 00:49:34.392992 | orchestrator | changed: [testbed-node-1] 2025-09-27 00:49:34.393000 | orchestrator | changed: [testbed-node-2] 2025-09-27 00:49:34.393008 | orchestrator | 2025-09-27 00:49:34.393016 | orchestrator | TASK [include_role : etcd] ***************************************************** 2025-09-27 00:49:34.393023 | orchestrator | Saturday 27 September 2025 00:46:03 +0000 (0:00:01.969) 0:01:56.924 **** 2025-09-27 00:49:34.393031 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:49:34.393039 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:49:34.393047 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:49:34.393054 | orchestrator | 2025-09-27 00:49:34.393062 | orchestrator | TASK [include_role : glance] *************************************************** 2025-09-27 00:49:34.393070 | orchestrator | Saturday 27 September 2025 00:46:04 +0000 (0:00:00.502) 0:01:57.426 **** 2025-09-27 00:49:34.393082 | orchestrator | included: glance for testbed-node-0, testbed-node-1, testbed-node-2 2025-09-27 00:49:34.393090 | orchestrator | 2025-09-27 00:49:34.393097 | orchestrator | TASK [haproxy-config : Copying over glance haproxy config] ********************* 2025-09-27 00:49:34.393105 | orchestrator | Saturday 27 September 2025 00:46:04 +0000 (0:00:00.796) 0:01:58.223 **** 2025-09-27 00:49:34.393132 | orchestrator | changed: [testbed-node-1] => (item={'key': 'glance-api', 'value': {'container_name': 'glance_api', 'group': 'glance-api', 'host_in_groups': True, 'enabled': True, 'image': 'registry.osism.tech/kolla/glance-api:2024.2', 'environment': {'http_proxy': '', 'https_proxy': '', 'no_proxy': 'localhost,127.0.0.1,192.168.16.11,192.168.16.9'}, 'privileged': True, 'volumes': ['/etc/kolla/glance-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'glance:/var/lib/glance/', '', 'kolla_logs:/var/log/kolla/', '', 'iscsi_info:/etc/iscsi', '/dev:/dev'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:9292'], 'timeout': '30'}, 'haproxy': {'glance_api': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9292', 'frontend_http_extra': ['timeout client 6h'], 'backend_http_extra': ['timeout server 6h'], 'custom_member_list': ['server testbed-node-0 192.168.16.10:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-1 192.168.16.11:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-2 192.168.16.12:9292 check inter 2000 rise 2 fall 5', '']}, 'glance_api_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9292', 'frontend_http_extra': ['timeout client 6h'], 'backend_http_extra': ['timeout server 6h'], 
'custom_member_list': ['server testbed-node-0 192.168.16.10:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-1 192.168.16.11:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-2 192.168.16.12:9292 check inter 2000 rise 2 fall 5', '']}}}}) 2025-09-27 00:49:34.393143 | orchestrator | changed: [testbed-node-0] => (item={'key': 'glance-api', 'value': {'container_name': 'glance_api', 'group': 'glance-api', 'host_in_groups': True, 'enabled': True, 'image': 'registry.osism.tech/kolla/glance-api:2024.2', 'environment': {'http_proxy': '', 'https_proxy': '', 'no_proxy': 'localhost,127.0.0.1,192.168.16.10,192.168.16.9'}, 'privileged': True, 'volumes': ['/etc/kolla/glance-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'glance:/var/lib/glance/', '', 'kolla_logs:/var/log/kolla/', '', 'iscsi_info:/etc/iscsi', '/dev:/dev'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:9292'], 'timeout': '30'}, 'haproxy': {'glance_api': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9292', 'frontend_http_extra': ['timeout client 6h'], 'backend_http_extra': ['timeout server 6h'], 'custom_member_list': ['server testbed-node-0 192.168.16.10:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-1 192.168.16.11:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-2 192.168.16.12:9292 check inter 2000 rise 2 fall 5', '']}, 'glance_api_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9292', 'frontend_http_extra': ['timeout client 6h'], 'backend_http_extra': ['timeout server 6h'], 'custom_member_list': ['server testbed-node-0 192.168.16.10:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-1 192.168.16.11:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-2 192.168.16.12:9292 check inter 2000 rise 2 fall 5', '']}}}}) 2025-09-27 00:49:34.393162 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'glance-tls-proxy', 'value': {'container_name': 'glance_tls_proxy', 'group': 'glance-api', 'host_in_groups': True, 'enabled': 'no', 'image': 'registry.osism.tech/kolla/glance-tls-proxy:2024.2', 'volumes': ['/etc/kolla/glance-tls-proxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl -u openstack:password 192.168.16.11:9293'], 'timeout': '30'}, 'haproxy': {'glance_tls_proxy': {'enabled': False, 'mode': 'http', 'external': False, 'port': '9292', 'frontend_http_extra': ['timeout client 6h'], 'backend_http_extra': ['timeout server 6h'], 'custom_member_list': ['server testbed-node-0 192.168.16.10:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', 'server testbed-node-1 192.168.16.11:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', 'server testbed-node-2 192.168.16.12:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', ''], 'tls_backend': 'yes'}, 'glance_tls_proxy_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9292', 'frontend_http_extra': ['timeout client 6h'], 'backend_http_extra': ['timeout server 6h'], 'custom_member_list': ['server testbed-node-0 
192.168.16.10:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', 'server testbed-node-1 192.168.16.11:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', 'server testbed-node-2 192.168.16.12:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', ''], 'tls_backend': 'yes'}}}})  2025-09-27 00:49:34.393177 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'glance-tls-proxy', 'value': {'container_name': 'glance_tls_proxy', 'group': 'glance-api', 'host_in_groups': True, 'enabled': 'no', 'image': 'registry.osism.tech/kolla/glance-tls-proxy:2024.2', 'volumes': ['/etc/kolla/glance-tls-proxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl -u openstack:password 192.168.16.10:9293'], 'timeout': '30'}, 'haproxy': {'glance_tls_proxy': {'enabled': False, 'mode': 'http', 'external': False, 'port': '9292', 'frontend_http_extra': ['timeout client 6h'], 'backend_http_extra': ['timeout server 6h'], 'custom_member_list': ['server testbed-node-0 192.168.16.10:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', 'server testbed-node-1 192.168.16.11:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', 'server testbed-node-2 192.168.16.12:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', ''], 'tls_backend': 'yes'}, 'glance_tls_proxy_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9292', 'frontend_http_extra': ['timeout client 6h'], 'backend_http_extra': ['timeout server 6h'], 'custom_member_list': ['server testbed-node-0 192.168.16.10:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', 'server testbed-node-1 192.168.16.11:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', 'server testbed-node-2 192.168.16.12:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', ''], 'tls_backend': 'yes'}}}})  2025-09-27 00:49:34.393211 | orchestrator | changed: [testbed-node-2] => (item={'key': 'glance-api', 'value': {'container_name': 'glance_api', 'group': 'glance-api', 'host_in_groups': True, 'enabled': True, 'image': 'registry.osism.tech/kolla/glance-api:2024.2', 'environment': {'http_proxy': '', 'https_proxy': '', 'no_proxy': 'localhost,127.0.0.1,192.168.16.12,192.168.16.9'}, 'privileged': True, 'volumes': ['/etc/kolla/glance-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'glance:/var/lib/glance/', '', 'kolla_logs:/var/log/kolla/', '', 'iscsi_info:/etc/iscsi', '/dev:/dev'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:9292'], 'timeout': '30'}, 'haproxy': {'glance_api': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9292', 'frontend_http_extra': ['timeout client 6h'], 'backend_http_extra': ['timeout server 6h'], 'custom_member_list': ['server testbed-node-0 192.168.16.10:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-1 192.168.16.11:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-2 192.168.16.12:9292 check inter 2000 rise 2 fall 5', 
'']}, 'glance_api_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9292', 'frontend_http_extra': ['timeout client 6h'], 'backend_http_extra': ['timeout server 6h'], 'custom_member_list': ['server testbed-node-0 192.168.16.10:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-1 192.168.16.11:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-2 192.168.16.12:9292 check inter 2000 rise 2 fall 5', '']}}}}) 2025-09-27 00:49:34.393230 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'glance-tls-proxy', 'value': {'container_name': 'glance_tls_proxy', 'group': 'glance-api', 'host_in_groups': True, 'enabled': 'no', 'image': 'registry.osism.tech/kolla/glance-tls-proxy:2024.2', 'volumes': ['/etc/kolla/glance-tls-proxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl -u openstack:password 192.168.16.12:9293'], 'timeout': '30'}, 'haproxy': {'glance_tls_proxy': {'enabled': False, 'mode': 'http', 'external': False, 'port': '9292', 'frontend_http_extra': ['timeout client 6h'], 'backend_http_extra': ['timeout server 6h'], 'custom_member_list': ['server testbed-node-0 192.168.16.10:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', 'server testbed-node-1 192.168.16.11:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', 'server testbed-node-2 192.168.16.12:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', ''], 'tls_backend': 'yes'}, 'glance_tls_proxy_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9292', 'frontend_http_extra': ['timeout client 6h'], 'backend_http_extra': ['timeout server 6h'], 'custom_member_list': ['server testbed-node-0 192.168.16.10:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', 'server testbed-node-1 192.168.16.11:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', 'server testbed-node-2 192.168.16.12:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', ''], 'tls_backend': 'yes'}}}})  2025-09-27 00:49:34.393239 | orchestrator | 2025-09-27 00:49:34.393247 | orchestrator | TASK [haproxy-config : Add configuration for glance when using single external frontend] *** 2025-09-27 00:49:34.393255 | orchestrator | Saturday 27 September 2025 00:46:09 +0000 (0:00:04.178) 0:02:02.401 **** 2025-09-27 00:49:34.393276 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'glance-api', 'value': {'container_name': 'glance_api', 'group': 'glance-api', 'host_in_groups': True, 'enabled': True, 'image': 'registry.osism.tech/kolla/glance-api:2024.2', 'environment': {'http_proxy': '', 'https_proxy': '', 'no_proxy': 'localhost,127.0.0.1,192.168.16.10,192.168.16.9'}, 'privileged': True, 'volumes': ['/etc/kolla/glance-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'glance:/var/lib/glance/', '', 'kolla_logs:/var/log/kolla/', '', 'iscsi_info:/etc/iscsi', '/dev:/dev'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:9292'], 'timeout': '30'}, 'haproxy': {'glance_api': 
{'enabled': True, 'mode': 'http', 'external': False, 'port': '9292', 'frontend_http_extra': ['timeout client 6h'], 'backend_http_extra': ['timeout server 6h'], 'custom_member_list': ['server testbed-node-0 192.168.16.10:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-1 192.168.16.11:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-2 192.168.16.12:9292 check inter 2000 rise 2 fall 5', '']}, 'glance_api_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9292', 'frontend_http_extra': ['timeout client 6h'], 'backend_http_extra': ['timeout server 6h'], 'custom_member_list': ['server testbed-node-0 192.168.16.10:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-1 192.168.16.11:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-2 192.168.16.12:9292 check inter 2000 rise 2 fall 5', '']}}}})  2025-09-27 00:49:34.393295 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'glance-tls-proxy', 'value': {'container_name': 'glance_tls_proxy', 'group': 'glance-api', 'host_in_groups': True, 'enabled': 'no', 'image': 'registry.osism.tech/kolla/glance-tls-proxy:2024.2', 'volumes': ['/etc/kolla/glance-tls-proxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl -u openstack:password 192.168.16.10:9293'], 'timeout': '30'}, 'haproxy': {'glance_tls_proxy': {'enabled': False, 'mode': 'http', 'external': False, 'port': '9292', 'frontend_http_extra': ['timeout client 6h'], 'backend_http_extra': ['timeout server 6h'], 'custom_member_list': ['server testbed-node-0 192.168.16.10:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', 'server testbed-node-1 192.168.16.11:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', 'server testbed-node-2 192.168.16.12:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', ''], 'tls_backend': 'yes'}, 'glance_tls_proxy_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9292', 'frontend_http_extra': ['timeout client 6h'], 'backend_http_extra': ['timeout server 6h'], 'custom_member_list': ['server testbed-node-0 192.168.16.10:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', 'server testbed-node-1 192.168.16.11:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', 'server testbed-node-2 192.168.16.12:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', ''], 'tls_backend': 'yes'}}}})  2025-09-27 00:49:34.393304 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:49:34.393313 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'glance-api', 'value': {'container_name': 'glance_api', 'group': 'glance-api', 'host_in_groups': True, 'enabled': True, 'image': 'registry.osism.tech/kolla/glance-api:2024.2', 'environment': {'http_proxy': '', 'https_proxy': '', 'no_proxy': 'localhost,127.0.0.1,192.168.16.11,192.168.16.9'}, 'privileged': True, 'volumes': ['/etc/kolla/glance-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'glance:/var/lib/glance/', '', 'kolla_logs:/var/log/kolla/', '', 'iscsi_info:/etc/iscsi', '/dev:/dev'], 'dimensions': {}, 
'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:9292'], 'timeout': '30'}, 'haproxy': {'glance_api': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9292', 'frontend_http_extra': ['timeout client 6h'], 'backend_http_extra': ['timeout server 6h'], 'custom_member_list': ['server testbed-node-0 192.168.16.10:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-1 192.168.16.11:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-2 192.168.16.12:9292 check inter 2000 rise 2 fall 5', '']}, 'glance_api_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9292', 'frontend_http_extra': ['timeout client 6h'], 'backend_http_extra': ['timeout server 6h'], 'custom_member_list': ['server testbed-node-0 192.168.16.10:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-1 192.168.16.11:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-2 192.168.16.12:9292 check inter 2000 rise 2 fall 5', '']}}}})  2025-09-27 00:49:34.393343 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'glance-tls-proxy', 'value': {'container_name': 'glance_tls_proxy', 'group': 'glance-api', 'host_in_groups': True, 'enabled': 'no', 'image': 'registry.osism.tech/kolla/glance-tls-proxy:2024.2', 'volumes': ['/etc/kolla/glance-tls-proxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl -u openstack:password 192.168.16.11:9293'], 'timeout': '30'}, 'haproxy': {'glance_tls_proxy': {'enabled': False, 'mode': 'http', 'external': False, 'port': '9292', 'frontend_http_extra': ['timeout client 6h'], 'backend_http_extra': ['timeout server 6h'], 'custom_member_list': ['server testbed-node-0 192.168.16.10:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', 'server testbed-node-1 192.168.16.11:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', 'server testbed-node-2 192.168.16.12:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', ''], 'tls_backend': 'yes'}, 'glance_tls_proxy_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9292', 'frontend_http_extra': ['timeout client 6h'], 'backend_http_extra': ['timeout server 6h'], 'custom_member_list': ['server testbed-node-0 192.168.16.10:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', 'server testbed-node-1 192.168.16.11:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', 'server testbed-node-2 192.168.16.12:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', ''], 'tls_backend': 'yes'}}}})  2025-09-27 00:49:34.393352 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:49:34.393361 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'glance-api', 'value': {'container_name': 'glance_api', 'group': 'glance-api', 'host_in_groups': True, 'enabled': True, 'image': 'registry.osism.tech/kolla/glance-api:2024.2', 'environment': {'http_proxy': '', 'https_proxy': '', 'no_proxy': 'localhost,127.0.0.1,192.168.16.12,192.168.16.9'}, 'privileged': True, 'volumes': ['/etc/kolla/glance-api/:/var/lib/kolla/config_files/:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'glance:/var/lib/glance/', '', 'kolla_logs:/var/log/kolla/', '', 'iscsi_info:/etc/iscsi', '/dev:/dev'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:9292'], 'timeout': '30'}, 'haproxy': {'glance_api': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9292', 'frontend_http_extra': ['timeout client 6h'], 'backend_http_extra': ['timeout server 6h'], 'custom_member_list': ['server testbed-node-0 192.168.16.10:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-1 192.168.16.11:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-2 192.168.16.12:9292 check inter 2000 rise 2 fall 5', '']}, 'glance_api_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9292', 'frontend_http_extra': ['timeout client 6h'], 'backend_http_extra': ['timeout server 6h'], 'custom_member_list': ['server testbed-node-0 192.168.16.10:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-1 192.168.16.11:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-2 192.168.16.12:9292 check inter 2000 rise 2 fall 5', '']}}}})  2025-09-27 00:49:34.393393 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'glance-tls-proxy', 'value': {'container_name': 'glance_tls_proxy', 'group': 'glance-api', 'host_in_groups': True, 'enabled': 'no', 'image': 'registry.osism.tech/kolla/glance-tls-proxy:2024.2', 'volumes': ['/etc/kolla/glance-tls-proxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl -u openstack:password 192.168.16.12:9293'], 'timeout': '30'}, 'haproxy': {'glance_tls_proxy': {'enabled': False, 'mode': 'http', 'external': False, 'port': '9292', 'frontend_http_extra': ['timeout client 6h'], 'backend_http_extra': ['timeout server 6h'], 'custom_member_list': ['server testbed-node-0 192.168.16.10:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', 'server testbed-node-1 192.168.16.11:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', 'server testbed-node-2 192.168.16.12:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', ''], 'tls_backend': 'yes'}, 'glance_tls_proxy_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9292', 'frontend_http_extra': ['timeout client 6h'], 'backend_http_extra': ['timeout server 6h'], 'custom_member_list': ['server testbed-node-0 192.168.16.10:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', 'server testbed-node-1 192.168.16.11:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', 'server testbed-node-2 192.168.16.12:9292 check inter 2000 rise 2 fall 5 ssl verify required ca-file ca-certificates.crt', ''], 'tls_backend': 'yes'}}}})  2025-09-27 00:49:34.393403 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:49:34.393411 | orchestrator | 2025-09-27 00:49:34.393419 | orchestrator | TASK [haproxy-config : Configuring firewall for glance] ************************ 2025-09-27 00:49:34.393427 | orchestrator | Saturday 27 September 2025 00:46:12 +0000 (0:00:03.391) 0:02:05.792 **** 
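For readers skimming the glance block above: the items the haproxy-config role loops over are plain dictionaries, and the custom_member_list already contains finished HAProxy server lines for the three controllers. A minimal Python sketch of the internal glance_api entry, with values copied from the log output above (the printed listen block is a hand-written approximation for orientation, not the role's actual template output; <internal_vip> is a placeholder):

# Values copied from the glance_api haproxy entry logged above.
glance_api = {
    "enabled": True,
    "mode": "http",
    "external": False,
    "port": "9292",
    "frontend_http_extra": ["timeout client 6h"],
    "backend_http_extra": ["timeout server 6h"],
    "custom_member_list": [
        "server testbed-node-0 192.168.16.10:9292 check inter 2000 rise 2 fall 5",
        "server testbed-node-1 192.168.16.11:9292 check inter 2000 rise 2 fall 5",
        "server testbed-node-2 192.168.16.12:9292 check inter 2000 rise 2 fall 5",
    ],
}

# Rough shape of the resulting internal listener (approximation only).
print("listen glance_api")
print(f"    mode {glance_api['mode']}")
print(f"    bind <internal_vip>:{glance_api['port']}")
for member in glance_api["custom_member_list"]:
    print(f"    {member}")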
2025-09-27 00:49:34.393435 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'glance_api', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9292', 'frontend_http_extra': ['timeout client 6h'], 'backend_http_extra': ['timeout server 6h'], 'custom_member_list': ['server testbed-node-0 192.168.16.10:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-1 192.168.16.11:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-2 192.168.16.12:9292 check inter 2000 rise 2 fall 5', '']}})  2025-09-27 00:49:34.393444 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'glance_api_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9292', 'frontend_http_extra': ['timeout client 6h'], 'backend_http_extra': ['timeout server 6h'], 'custom_member_list': ['server testbed-node-0 192.168.16.10:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-1 192.168.16.11:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-2 192.168.16.12:9292 check inter 2000 rise 2 fall 5', '']}})  2025-09-27 00:49:34.393452 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:49:34.393460 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'glance_api', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9292', 'frontend_http_extra': ['timeout client 6h'], 'backend_http_extra': ['timeout server 6h'], 'custom_member_list': ['server testbed-node-0 192.168.16.10:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-1 192.168.16.11:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-2 192.168.16.12:9292 check inter 2000 rise 2 fall 5', '']}})  2025-09-27 00:49:34.393474 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'glance_api_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9292', 'frontend_http_extra': ['timeout client 6h'], 'backend_http_extra': ['timeout server 6h'], 'custom_member_list': ['server testbed-node-0 192.168.16.10:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-1 192.168.16.11:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-2 192.168.16.12:9292 check inter 2000 rise 2 fall 5', '']}})  2025-09-27 00:49:34.393482 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:49:34.393490 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'glance_api', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9292', 'frontend_http_extra': ['timeout client 6h'], 'backend_http_extra': ['timeout server 6h'], 'custom_member_list': ['server testbed-node-0 192.168.16.10:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-1 192.168.16.11:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-2 192.168.16.12:9292 check inter 2000 rise 2 fall 5', '']}})  2025-09-27 00:49:34.393510 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'glance_api_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9292', 'frontend_http_extra': ['timeout client 6h'], 'backend_http_extra': ['timeout server 6h'], 'custom_member_list': ['server testbed-node-0 192.168.16.10:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-1 192.168.16.11:9292 check inter 2000 rise 2 fall 5', 'server testbed-node-2 192.168.16.12:9292 check inter 2000 rise 2 fall 5', '']}})  2025-09-27 00:49:34.393518 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:49:34.393526 | 
orchestrator | 2025-09-27 00:49:34.393534 | orchestrator | TASK [proxysql-config : Copying over glance ProxySQL users config] ************* 2025-09-27 00:49:34.393542 | orchestrator | Saturday 27 September 2025 00:46:16 +0000 (0:00:03.579) 0:02:09.372 **** 2025-09-27 00:49:34.393553 | orchestrator | changed: [testbed-node-0] 2025-09-27 00:49:34.393561 | orchestrator | changed: [testbed-node-1] 2025-09-27 00:49:34.393569 | orchestrator | changed: [testbed-node-2] 2025-09-27 00:49:34.393577 | orchestrator | 2025-09-27 00:49:34.393585 | orchestrator | TASK [proxysql-config : Copying over glance ProxySQL rules config] ************* 2025-09-27 00:49:34.393593 | orchestrator | Saturday 27 September 2025 00:46:17 +0000 (0:00:01.208) 0:02:10.581 **** 2025-09-27 00:49:34.393600 | orchestrator | changed: [testbed-node-0] 2025-09-27 00:49:34.393608 | orchestrator | changed: [testbed-node-2] 2025-09-27 00:49:34.393616 | orchestrator | changed: [testbed-node-1] 2025-09-27 00:49:34.393623 | orchestrator | 2025-09-27 00:49:34.393631 | orchestrator | TASK [include_role : gnocchi] ************************************************** 2025-09-27 00:49:34.393639 | orchestrator | Saturday 27 September 2025 00:46:19 +0000 (0:00:01.786) 0:02:12.367 **** 2025-09-27 00:49:34.393647 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:49:34.393654 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:49:34.393662 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:49:34.393670 | orchestrator | 2025-09-27 00:49:34.393678 | orchestrator | TASK [include_role : grafana] ************************************************** 2025-09-27 00:49:34.393685 | orchestrator | Saturday 27 September 2025 00:46:19 +0000 (0:00:00.424) 0:02:12.791 **** 2025-09-27 00:49:34.393693 | orchestrator | included: grafana for testbed-node-0, testbed-node-1, testbed-node-2 2025-09-27 00:49:34.393701 | orchestrator | 2025-09-27 00:49:34.393709 | orchestrator | TASK [haproxy-config : Copying over grafana haproxy config] ******************** 2025-09-27 00:49:34.393717 | orchestrator | Saturday 27 September 2025 00:46:20 +0000 (0:00:00.817) 0:02:13.609 **** 2025-09-27 00:49:34.393725 | orchestrator | changed: [testbed-node-2] => (item={'key': 'grafana', 'value': {'container_name': 'grafana', 'group': 'grafana', 'enabled': True, 'image': 'registry.osism.tech/kolla/grafana:2024.2', 'volumes': ['/etc/kolla/grafana/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'grafana_server': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '3000', 'listen_port': '3000'}, 'grafana_server_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '3000', 'listen_port': '3000'}}}}) 2025-09-27 00:49:34.393739 | orchestrator | changed: [testbed-node-0] => (item={'key': 'grafana', 'value': {'container_name': 'grafana', 'group': 'grafana', 'enabled': True, 'image': 'registry.osism.tech/kolla/grafana:2024.2', 'volumes': ['/etc/kolla/grafana/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'grafana_server': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '3000', 'listen_port': '3000'}, 'grafana_server_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '3000', 'listen_port': '3000'}}}}) 
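The grafana item just logged follows the same pattern as every other service in this play: its 'haproxy' mapping carries one internal and one external listener definition, both forwarding to the same service port, with the external one bound to api.testbed.osism.xyz. A small sketch using the logged values (illustration only; the variable names are invented, and the enabled flags are kept exactly as they appear in the log):

# The grafana 'haproxy' mapping as logged above.
grafana_haproxy = {
    "grafana_server": {
        "enabled": "yes", "mode": "http", "external": False,
        "port": "3000", "listen_port": "3000",
    },
    "grafana_server_external": {
        "enabled": True, "mode": "http", "external": True,
        "external_fqdn": "api.testbed.osism.xyz",
        "port": "3000", "listen_port": "3000",
    },
}

# Summarize which side each listener serves.
for name, svc in grafana_haproxy.items():
    side = "external (api.testbed.osism.xyz)" if svc["external"] else "internal"
    print(f"{name}: {side} frontend on {svc['listen_port']} -> service port {svc['port']}")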
2025-09-27 00:49:34.393747 | orchestrator | changed: [testbed-node-1] => (item={'key': 'grafana', 'value': {'container_name': 'grafana', 'group': 'grafana', 'enabled': True, 'image': 'registry.osism.tech/kolla/grafana:2024.2', 'volumes': ['/etc/kolla/grafana/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'grafana_server': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '3000', 'listen_port': '3000'}, 'grafana_server_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '3000', 'listen_port': '3000'}}}}) 2025-09-27 00:49:34.393755 | orchestrator | 2025-09-27 00:49:34.393763 | orchestrator | TASK [haproxy-config : Add configuration for grafana when using single external frontend] *** 2025-09-27 00:49:34.393771 | orchestrator | Saturday 27 September 2025 00:46:24 +0000 (0:00:04.369) 0:02:17.978 **** 2025-09-27 00:49:34.393794 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'grafana', 'value': {'container_name': 'grafana', 'group': 'grafana', 'enabled': True, 'image': 'registry.osism.tech/kolla/grafana:2024.2', 'volumes': ['/etc/kolla/grafana/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'grafana_server': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '3000', 'listen_port': '3000'}, 'grafana_server_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '3000', 'listen_port': '3000'}}}})  2025-09-27 00:49:34.393803 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'grafana', 'value': {'container_name': 'grafana', 'group': 'grafana', 'enabled': True, 'image': 'registry.osism.tech/kolla/grafana:2024.2', 'volumes': ['/etc/kolla/grafana/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'grafana_server': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '3000', 'listen_port': '3000'}, 'grafana_server_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '3000', 'listen_port': '3000'}}}})  2025-09-27 00:49:34.393811 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:49:34.393819 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:49:34.393827 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'grafana', 'value': {'container_name': 'grafana', 'group': 'grafana', 'enabled': True, 'image': 'registry.osism.tech/kolla/grafana:2024.2', 'volumes': ['/etc/kolla/grafana/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'grafana_server': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '3000', 'listen_port': '3000'}, 'grafana_server_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '3000', 'listen_port': '3000'}}}})  2025-09-27 00:49:34.393841 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:49:34.393849 | orchestrator | 2025-09-27 00:49:34.393856 | orchestrator | TASK [haproxy-config : Configuring firewall for grafana] *********************** 2025-09-27 00:49:34.393864 | orchestrator | Saturday 27 
September 2025 00:46:25 +0000 (0:00:00.524) 0:02:18.502 **** 2025-09-27 00:49:34.393872 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'grafana_server', 'value': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '3000', 'listen_port': '3000'}})  2025-09-27 00:49:34.393880 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'grafana_server_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '3000', 'listen_port': '3000'}})  2025-09-27 00:49:34.393888 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:49:34.393896 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'grafana_server', 'value': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '3000', 'listen_port': '3000'}})  2025-09-27 00:49:34.393904 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'grafana_server_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '3000', 'listen_port': '3000'}})  2025-09-27 00:49:34.393911 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:49:34.393928 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'grafana_server', 'value': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '3000', 'listen_port': '3000'}})  2025-09-27 00:49:34.393937 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'grafana_server_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '3000', 'listen_port': '3000'}})  2025-09-27 00:49:34.393945 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:49:34.393952 | orchestrator | 2025-09-27 00:49:34.393960 | orchestrator | TASK [proxysql-config : Copying over grafana ProxySQL users config] ************ 2025-09-27 00:49:34.393968 | orchestrator | Saturday 27 September 2025 00:46:25 +0000 (0:00:00.616) 0:02:19.118 **** 2025-09-27 00:49:34.393976 | orchestrator | changed: [testbed-node-0] 2025-09-27 00:49:34.393983 | orchestrator | changed: [testbed-node-1] 2025-09-27 00:49:34.393991 | orchestrator | changed: [testbed-node-2] 2025-09-27 00:49:34.393999 | orchestrator | 2025-09-27 00:49:34.394006 | orchestrator | TASK [proxysql-config : Copying over grafana ProxySQL rules config] ************ 2025-09-27 00:49:34.394014 | orchestrator | Saturday 27 September 2025 00:46:27 +0000 (0:00:01.291) 0:02:20.409 **** 2025-09-27 00:49:34.394045 | orchestrator | changed: [testbed-node-0] 2025-09-27 00:49:34.394053 | orchestrator | changed: [testbed-node-1] 2025-09-27 00:49:34.394061 | orchestrator | changed: [testbed-node-2] 2025-09-27 00:49:34.394069 | orchestrator | 2025-09-27 00:49:34.394076 | orchestrator | TASK [include_role : heat] ***************************************************** 2025-09-27 00:49:34.394084 | orchestrator | Saturday 27 September 2025 00:46:28 +0000 (0:00:01.770) 0:02:22.180 **** 2025-09-27 00:49:34.394092 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:49:34.394100 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:49:34.394120 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:49:34.394128 | orchestrator | 2025-09-27 00:49:34.394136 | orchestrator | TASK [include_role : horizon] ************************************************** 2025-09-27 00:49:34.394144 | orchestrator | Saturday 27 September 2025 00:46:29 +0000 (0:00:00.445) 0:02:22.625 **** 2025-09-27 00:49:34.394152 | orchestrator | included: horizon for testbed-node-0, 
testbed-node-1, testbed-node-2 2025-09-27 00:49:34.394160 | orchestrator | 2025-09-27 00:49:34.394167 | orchestrator | TASK [haproxy-config : Copying over horizon haproxy config] ******************** 2025-09-27 00:49:34.394175 | orchestrator | Saturday 27 September 2025 00:46:30 +0000 (0:00:00.809) 0:02:23.434 **** 2025-09-27 00:49:34.394193 | orchestrator | changed: [testbed-node-1] => (item={'key': 'horizon', 'value': {'container_name': 'horizon', 'group': 'horizon', 'enabled': True, 'image': 'registry.osism.tech/kolla/horizon:2024.2', 'environment': {'ENABLE_BLAZAR': 'no', 'ENABLE_CLOUDKITTY': 'no', 'ENABLE_DESIGNATE': 'yes', 'ENABLE_FWAAS': 'no', 'ENABLE_HEAT': 'no', 'ENABLE_IRONIC': 'no', 'ENABLE_MAGNUM': 'yes', 'ENABLE_MANILA': 'yes', 'ENABLE_MASAKARI': 'no', 'ENABLE_MISTRAL': 'no', 'ENABLE_NEUTRON_VPNAAS': 'no', 'ENABLE_OCTAVIA': 'yes', 'ENABLE_TACKER': 'no', 'ENABLE_TROVE': 'no', 'ENABLE_WATCHER': 'no', 'ENABLE_ZUN': 'no', 'FORCE_GENERATE': 'no'}, 'volumes': ['/etc/kolla/horizon/:/var/lib/kolla/config_files/:ro', '', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:80'], 'timeout': '30'}, 'haproxy': {'horizon': {'enabled': True, 'mode': 'http', 'external': False, 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin'], 'tls_backend': 'no'}, 'horizon_redirect': {'enabled': True, 'mode': 'redirect', 'external': False, 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'horizon_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin'], 'tls_backend': 'no'}, 'horizon_external_redirect': {'enabled': True, 'mode': 'redirect', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'acme_client': {'enabled': True, 'with_frontend': False, 'custom_member_list': []}}}}) 2025-09-27 00:49:34.394253 | orchestrator | changed: [testbed-node-0] => (item={'key': 'horizon', 'value': {'container_name': 'horizon', 'group': 'horizon', 'enabled': True, 'image': 'registry.osism.tech/kolla/horizon:2024.2', 'environment': {'ENABLE_BLAZAR': 'no', 'ENABLE_CLOUDKITTY': 'no', 'ENABLE_DESIGNATE': 'yes', 'ENABLE_FWAAS': 'no', 'ENABLE_HEAT': 'no', 'ENABLE_IRONIC': 'no', 'ENABLE_MAGNUM': 'yes', 'ENABLE_MANILA': 'yes', 'ENABLE_MASAKARI': 'no', 'ENABLE_MISTRAL': 'no', 'ENABLE_NEUTRON_VPNAAS': 'no', 'ENABLE_OCTAVIA': 'yes', 'ENABLE_TACKER': 'no', 'ENABLE_TROVE': 'no', 'ENABLE_WATCHER': 'no', 'ENABLE_ZUN': 'no', 'FORCE_GENERATE': 'no'}, 'volumes': ['/etc/kolla/horizon/:/var/lib/kolla/config_files/:ro', '', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:80'], 'timeout': '30'}, 'haproxy': {'horizon': {'enabled': True, 'mode': 
'http', 'external': False, 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin'], 'tls_backend': 'no'}, 'horizon_redirect': {'enabled': True, 'mode': 'redirect', 'external': False, 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'horizon_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin'], 'tls_backend': 'no'}, 'horizon_external_redirect': {'enabled': True, 'mode': 'redirect', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'acme_client': {'enabled': True, 'with_frontend': False, 'custom_member_list': []}}}}) 2025-09-27 00:49:34.394271 | orchestrator | changed: [testbed-node-2] => (item={'key': 'horizon', 'value': {'container_name': 'horizon', 'group': 'horizon', 'enabled': True, 'image': 'registry.osism.tech/kolla/horizon:2024.2', 'environment': {'ENABLE_BLAZAR': 'no', 'ENABLE_CLOUDKITTY': 'no', 'ENABLE_DESIGNATE': 'yes', 'ENABLE_FWAAS': 'no', 'ENABLE_HEAT': 'no', 'ENABLE_IRONIC': 'no', 'ENABLE_MAGNUM': 'yes', 'ENABLE_MANILA': 'yes', 'ENABLE_MASAKARI': 'no', 'ENABLE_MISTRAL': 'no', 'ENABLE_NEUTRON_VPNAAS': 'no', 'ENABLE_OCTAVIA': 'yes', 'ENABLE_TACKER': 'no', 'ENABLE_TROVE': 'no', 'ENABLE_WATCHER': 'no', 'ENABLE_ZUN': 'no', 'FORCE_GENERATE': 'no'}, 'volumes': ['/etc/kolla/horizon/:/var/lib/kolla/config_files/:ro', '', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:80'], 'timeout': '30'}, 'haproxy': {'horizon': {'enabled': True, 'mode': 'http', 'external': False, 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin'], 'tls_backend': 'no'}, 'horizon_redirect': {'enabled': True, 'mode': 'redirect', 'external': False, 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'horizon_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin'], 'tls_backend': 'no'}, 'horizon_external_redirect': {'enabled': True, 'mode': 'redirect', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'acme_client': {'enabled': True, 'with_frontend': False, 'custom_member_list': []}}}}) 2025-09-27 00:49:34.394280 | orchestrator | 2025-09-27 00:49:34.394288 | orchestrator | TASK [haproxy-config : Add configuration for horizon when using single external frontend] *** 2025-09-27 00:49:34.394296 | orchestrator | 
Saturday 27 September 2025 00:46:34 +0000 (0:00:03.974) 0:02:27.409 **** 2025-09-27 00:49:34.394315 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'horizon', 'value': {'container_name': 'horizon', 'group': 'horizon', 'enabled': True, 'image': 'registry.osism.tech/kolla/horizon:2024.2', 'environment': {'ENABLE_BLAZAR': 'no', 'ENABLE_CLOUDKITTY': 'no', 'ENABLE_DESIGNATE': 'yes', 'ENABLE_FWAAS': 'no', 'ENABLE_HEAT': 'no', 'ENABLE_IRONIC': 'no', 'ENABLE_MAGNUM': 'yes', 'ENABLE_MANILA': 'yes', 'ENABLE_MASAKARI': 'no', 'ENABLE_MISTRAL': 'no', 'ENABLE_NEUTRON_VPNAAS': 'no', 'ENABLE_OCTAVIA': 'yes', 'ENABLE_TACKER': 'no', 'ENABLE_TROVE': 'no', 'ENABLE_WATCHER': 'no', 'ENABLE_ZUN': 'no', 'FORCE_GENERATE': 'no'}, 'volumes': ['/etc/kolla/horizon/:/var/lib/kolla/config_files/:ro', '', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:80'], 'timeout': '30'}, 'haproxy': {'horizon': {'enabled': True, 'mode': 'http', 'external': False, 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin'], 'tls_backend': 'no'}, 'horizon_redirect': {'enabled': True, 'mode': 'redirect', 'external': False, 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'horizon_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin'], 'tls_backend': 'no'}, 'horizon_external_redirect': {'enabled': True, 'mode': 'redirect', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'acme_client': {'enabled': True, 'with_frontend': False, 'custom_member_list': []}}}})  2025-09-27 00:49:34.394330 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:49:34.394339 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'horizon', 'value': {'container_name': 'horizon', 'group': 'horizon', 'enabled': True, 'image': 'registry.osism.tech/kolla/horizon:2024.2', 'environment': {'ENABLE_BLAZAR': 'no', 'ENABLE_CLOUDKITTY': 'no', 'ENABLE_DESIGNATE': 'yes', 'ENABLE_FWAAS': 'no', 'ENABLE_HEAT': 'no', 'ENABLE_IRONIC': 'no', 'ENABLE_MAGNUM': 'yes', 'ENABLE_MANILA': 'yes', 'ENABLE_MASAKARI': 'no', 'ENABLE_MISTRAL': 'no', 'ENABLE_NEUTRON_VPNAAS': 'no', 'ENABLE_OCTAVIA': 'yes', 'ENABLE_TACKER': 'no', 'ENABLE_TROVE': 'no', 'ENABLE_WATCHER': 'no', 'ENABLE_ZUN': 'no', 'FORCE_GENERATE': 'no'}, 'volumes': ['/etc/kolla/horizon/:/var/lib/kolla/config_files/:ro', '', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:80'], 'timeout': '30'}, 'haproxy': {'horizon': {'enabled': True, 'mode': 'http', 'external': False, 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 
'backend_http_extra': ['balance roundrobin'], 'tls_backend': 'no'}, 'horizon_redirect': {'enabled': True, 'mode': 'redirect', 'external': False, 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'horizon_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin'], 'tls_backend': 'no'}, 'horizon_external_redirect': {'enabled': True, 'mode': 'redirect', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'acme_client': {'enabled': True, 'with_frontend': False, 'custom_member_list': []}}}})  2025-09-27 00:49:34.394347 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:49:34.394366 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'horizon', 'value': {'container_name': 'horizon', 'group': 'horizon', 'enabled': True, 'image': 'registry.osism.tech/kolla/horizon:2024.2', 'environment': {'ENABLE_BLAZAR': 'no', 'ENABLE_CLOUDKITTY': 'no', 'ENABLE_DESIGNATE': 'yes', 'ENABLE_FWAAS': 'no', 'ENABLE_HEAT': 'no', 'ENABLE_IRONIC': 'no', 'ENABLE_MAGNUM': 'yes', 'ENABLE_MANILA': 'yes', 'ENABLE_MASAKARI': 'no', 'ENABLE_MISTRAL': 'no', 'ENABLE_NEUTRON_VPNAAS': 'no', 'ENABLE_OCTAVIA': 'yes', 'ENABLE_TACKER': 'no', 'ENABLE_TROVE': 'no', 'ENABLE_WATCHER': 'no', 'ENABLE_ZUN': 'no', 'FORCE_GENERATE': 'no'}, 'volumes': ['/etc/kolla/horizon/:/var/lib/kolla/config_files/:ro', '', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:80'], 'timeout': '30'}, 'haproxy': {'horizon': {'enabled': True, 'mode': 'http', 'external': False, 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin'], 'tls_backend': 'no'}, 'horizon_redirect': {'enabled': True, 'mode': 'redirect', 'external': False, 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'horizon_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin'], 'tls_backend': 'no'}, 'horizon_external_redirect': {'enabled': True, 'mode': 'redirect', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'acme_client': {'enabled': True, 'with_frontend': False, 'custom_member_list': []}}}})  2025-09-27 00:49:34.394380 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:49:34.394388 | orchestrator | 2025-09-27 00:49:34.394396 | orchestrator | TASK [haproxy-config : Configuring firewall for horizon] *********************** 2025-09-27 00:49:34.394404 | orchestrator | Saturday 27 September 2025 00:46:35 
+0000 (0:00:01.306) 0:02:28.715 **** 2025-09-27 00:49:34.394412 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'horizon', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin'], 'tls_backend': 'no'}})  2025-09-27 00:49:34.394420 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'horizon_redirect', 'value': {'enabled': True, 'mode': 'redirect', 'external': False, 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}})  2025-09-27 00:49:34.394429 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'horizon', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin'], 'tls_backend': 'no'}})  2025-09-27 00:49:34.394437 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'horizon_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin'], 'tls_backend': 'no'}})  2025-09-27 00:49:34.394445 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'horizon_external_redirect', 'value': {'enabled': True, 'mode': 'redirect', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}})  2025-09-27 00:49:34.394454 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'acme_client', 'value': {'enabled': True, 'with_frontend': False, 'custom_member_list': []}})  2025-09-27 00:49:34.394461 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:49:34.394470 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'horizon_redirect', 'value': {'enabled': True, 'mode': 'redirect', 'external': False, 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}})  2025-09-27 00:49:34.394478 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'horizon_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin'], 'tls_backend': 'no'}})  2025-09-27 00:49:34.394495 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'horizon_external_redirect', 'value': {'enabled': True, 'mode': 'redirect', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}})  2025-09-27 00:49:34.394503 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'acme_client', 'value': {'enabled': True, 'with_frontend': False, 'custom_member_list': []}})  2025-09-27 00:49:34.394511 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:49:34.394526 | 
orchestrator | skipping: [testbed-node-2] => (item={'key': 'horizon', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin'], 'tls_backend': 'no'}})  2025-09-27 00:49:34.394535 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'horizon_redirect', 'value': {'enabled': True, 'mode': 'redirect', 'external': False, 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}})  2025-09-27 00:49:34.394543 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'horizon_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin'], 'tls_backend': 'no'}})  2025-09-27 00:49:34.394551 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'horizon_external_redirect', 'value': {'enabled': True, 'mode': 'redirect', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}})  2025-09-27 00:49:34.394559 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'acme_client', 'value': {'enabled': True, 'with_frontend': False, 'custom_member_list': []}})  2025-09-27 00:49:34.394567 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:49:34.394575 | orchestrator | 2025-09-27 00:49:34.394583 | orchestrator | TASK [proxysql-config : Copying over horizon ProxySQL users config] ************ 2025-09-27 00:49:34.394591 | orchestrator | Saturday 27 September 2025 00:46:36 +0000 (0:00:01.277) 0:02:29.993 **** 2025-09-27 00:49:34.394599 | orchestrator | changed: [testbed-node-1] 2025-09-27 00:49:34.394607 | orchestrator | changed: [testbed-node-0] 2025-09-27 00:49:34.394615 | orchestrator | changed: [testbed-node-2] 2025-09-27 00:49:34.394622 | orchestrator | 2025-09-27 00:49:34.394630 | orchestrator | TASK [proxysql-config : Copying over horizon ProxySQL rules config] ************ 2025-09-27 00:49:34.394638 | orchestrator | Saturday 27 September 2025 00:46:37 +0000 (0:00:01.323) 0:02:31.316 **** 2025-09-27 00:49:34.394646 | orchestrator | changed: [testbed-node-0] 2025-09-27 00:49:34.394654 | orchestrator | changed: [testbed-node-1] 2025-09-27 00:49:34.394661 | orchestrator | changed: [testbed-node-2] 2025-09-27 00:49:34.394669 | orchestrator | 2025-09-27 00:49:34.394677 | orchestrator | TASK [include_role : influxdb] ************************************************* 2025-09-27 00:49:34.394685 | orchestrator | Saturday 27 September 2025 00:46:40 +0000 (0:00:02.099) 0:02:33.416 **** 2025-09-27 00:49:34.394692 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:49:34.394700 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:49:34.394708 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:49:34.394716 | orchestrator | 2025-09-27 00:49:34.394724 | orchestrator | TASK [include_role : ironic] *************************************************** 2025-09-27 00:49:34.394736 | orchestrator | Saturday 27 September 2025 00:46:40 +0000 (0:00:00.365) 0:02:33.781 **** 2025-09-27 00:49:34.394744 | orchestrator | skipping: 
[testbed-node-0] 2025-09-27 00:49:34.394752 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:49:34.394760 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:49:34.394768 | orchestrator | 2025-09-27 00:49:34.394775 | orchestrator | TASK [include_role : keystone] ************************************************* 2025-09-27 00:49:34.394783 | orchestrator | Saturday 27 September 2025 00:46:40 +0000 (0:00:00.526) 0:02:34.308 **** 2025-09-27 00:49:34.394791 | orchestrator | included: keystone for testbed-node-0, testbed-node-1, testbed-node-2 2025-09-27 00:49:34.394799 | orchestrator | 2025-09-27 00:49:34.394807 | orchestrator | TASK [haproxy-config : Copying over keystone haproxy config] ******************* 2025-09-27 00:49:34.394815 | orchestrator | Saturday 27 September 2025 00:46:41 +0000 (0:00:01.006) 0:02:35.314 **** 2025-09-27 00:49:34.394837 | orchestrator | changed: [testbed-node-0] => (item={'key': 'keystone', 'value': {'container_name': 'keystone', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone:2024.2', 'volumes': ['/etc/kolla/keystone/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/', '', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:5000'], 'timeout': '30'}, 'haproxy': {'keystone_internal': {'enabled': True, 'mode': 'http', 'external': False, 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin']}, 'keystone_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin']}}}}) 2025-09-27 00:49:34.394851 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'keystone-ssh', 'value': {'container_name': 'keystone_ssh', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone-ssh:2024.2', 'volumes': ['/etc/kolla/keystone-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 8023'], 'timeout': '30'}}})  2025-09-27 00:49:34.394860 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'keystone-fernet', 'value': {'container_name': 'keystone_fernet', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone-fernet:2024.2', 'volumes': ['/etc/kolla/keystone-fernet/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/fernet-healthcheck.sh'], 'timeout': '30'}}})  2025-09-27 00:49:34.394869 | orchestrator | changed: [testbed-node-1] => (item={'key': 'keystone', 'value': {'container_name': 'keystone', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone:2024.2', 'volumes': ['/etc/kolla/keystone/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 
'kolla_logs:/var/log/kolla/', '', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:5000'], 'timeout': '30'}, 'haproxy': {'keystone_internal': {'enabled': True, 'mode': 'http', 'external': False, 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin']}, 'keystone_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin']}}}}) 2025-09-27 00:49:34.394883 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'keystone-ssh', 'value': {'container_name': 'keystone_ssh', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone-ssh:2024.2', 'volumes': ['/etc/kolla/keystone-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 8023'], 'timeout': '30'}}})  2025-09-27 00:49:34.394892 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'keystone-fernet', 'value': {'container_name': 'keystone_fernet', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone-fernet:2024.2', 'volumes': ['/etc/kolla/keystone-fernet/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/fernet-healthcheck.sh'], 'timeout': '30'}}})  2025-09-27 00:49:34.394909 | orchestrator | changed: [testbed-node-2] => (item={'key': 'keystone', 'value': {'container_name': 'keystone', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone:2024.2', 'volumes': ['/etc/kolla/keystone/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/', '', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:5000'], 'timeout': '30'}, 'haproxy': {'keystone_internal': {'enabled': True, 'mode': 'http', 'external': False, 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin']}, 'keystone_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin']}}}}) 2025-09-27 00:49:34.394918 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'keystone-ssh', 'value': {'container_name': 'keystone_ssh', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone-ssh:2024.2', 'volumes': ['/etc/kolla/keystone-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 
'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 8023'], 'timeout': '30'}}})  2025-09-27 00:49:34.394927 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'keystone-fernet', 'value': {'container_name': 'keystone_fernet', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone-fernet:2024.2', 'volumes': ['/etc/kolla/keystone-fernet/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/fernet-healthcheck.sh'], 'timeout': '30'}}})  2025-09-27 00:49:34.394935 | orchestrator | 2025-09-27 00:49:34.394943 | orchestrator | TASK [haproxy-config : Add configuration for keystone when using single external frontend] *** 2025-09-27 00:49:34.394951 | orchestrator | Saturday 27 September 2025 00:46:46 +0000 (0:00:04.071) 0:02:39.386 **** 2025-09-27 00:49:34.394960 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'keystone', 'value': {'container_name': 'keystone', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone:2024.2', 'volumes': ['/etc/kolla/keystone/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/', '', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:5000'], 'timeout': '30'}, 'haproxy': {'keystone_internal': {'enabled': True, 'mode': 'http', 'external': False, 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin']}, 'keystone_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin']}}}})  2025-09-27 00:49:34.394973 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'keystone-ssh', 'value': {'container_name': 'keystone_ssh', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone-ssh:2024.2', 'volumes': ['/etc/kolla/keystone-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 8023'], 'timeout': '30'}}})  2025-09-27 00:49:34.394996 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'keystone-fernet', 'value': {'container_name': 'keystone_fernet', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone-fernet:2024.2', 'volumes': ['/etc/kolla/keystone-fernet/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/fernet-healthcheck.sh'], 'timeout': '30'}}})  2025-09-27 00:49:34.395005 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:49:34.395017 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'keystone', 
'value': {'container_name': 'keystone', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone:2024.2', 'volumes': ['/etc/kolla/keystone/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/', '', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:5000'], 'timeout': '30'}, 'haproxy': {'keystone_internal': {'enabled': True, 'mode': 'http', 'external': False, 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin']}, 'keystone_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin']}}}})  2025-09-27 00:49:34.395026 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'keystone-ssh', 'value': {'container_name': 'keystone_ssh', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone-ssh:2024.2', 'volumes': ['/etc/kolla/keystone-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 8023'], 'timeout': '30'}}})  2025-09-27 00:49:34.395034 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'keystone-fernet', 'value': {'container_name': 'keystone_fernet', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone-fernet:2024.2', 'volumes': ['/etc/kolla/keystone-fernet/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/fernet-healthcheck.sh'], 'timeout': '30'}}})  2025-09-27 00:49:34.395047 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:49:34.395056 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'keystone', 'value': {'container_name': 'keystone', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone:2024.2', 'volumes': ['/etc/kolla/keystone/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/', '', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:5000'], 'timeout': '30'}, 'haproxy': {'keystone_internal': {'enabled': True, 'mode': 'http', 'external': False, 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin']}, 'keystone_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin']}}}})  2025-09-27 00:49:34.395077 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'keystone-ssh', 'value': {'container_name': 'keystone_ssh', 'group': 'keystone', 'enabled': True, 
'image': 'registry.osism.tech/kolla/keystone-ssh:2024.2', 'volumes': ['/etc/kolla/keystone-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 8023'], 'timeout': '30'}}})  2025-09-27 00:49:34.395090 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'keystone-fernet', 'value': {'container_name': 'keystone_fernet', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone-fernet:2024.2', 'volumes': ['/etc/kolla/keystone-fernet/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/fernet-healthcheck.sh'], 'timeout': '30'}}})  2025-09-27 00:49:34.395098 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:49:34.395106 | orchestrator | 2025-09-27 00:49:34.395114 | orchestrator | TASK [haproxy-config : Configuring firewall for keystone] ********************** 2025-09-27 00:49:34.395122 | orchestrator | Saturday 27 September 2025 00:46:46 +0000 (0:00:00.806) 0:02:40.192 **** 2025-09-27 00:49:34.395130 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'keystone_internal', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin']}})  2025-09-27 00:49:34.395139 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'keystone_internal', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin']}})  2025-09-27 00:49:34.395147 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'keystone_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin']}})  2025-09-27 00:49:34.395162 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:49:34.395170 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'keystone_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin']}})  2025-09-27 00:49:34.395178 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:49:34.395186 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'keystone_internal', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin']}})  2025-09-27 00:49:34.395195 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'keystone_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin']}})  2025-09-27 00:49:34.395238 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:49:34.395246 | orchestrator | 2025-09-27 00:49:34.395254 | orchestrator 
| TASK [proxysql-config : Copying over keystone ProxySQL users config] *********** 2025-09-27 00:49:34.395262 | orchestrator | Saturday 27 September 2025 00:46:47 +0000 (0:00:00.954) 0:02:41.147 **** 2025-09-27 00:49:34.395270 | orchestrator | changed: [testbed-node-0] 2025-09-27 00:49:34.395278 | orchestrator | changed: [testbed-node-1] 2025-09-27 00:49:34.395285 | orchestrator | changed: [testbed-node-2] 2025-09-27 00:49:34.395293 | orchestrator | 2025-09-27 00:49:34.395301 | orchestrator | TASK [proxysql-config : Copying over keystone ProxySQL rules config] *********** 2025-09-27 00:49:34.395309 | orchestrator | Saturday 27 September 2025 00:46:49 +0000 (0:00:01.210) 0:02:42.358 **** 2025-09-27 00:49:34.395317 | orchestrator | changed: [testbed-node-0] 2025-09-27 00:49:34.395324 | orchestrator | changed: [testbed-node-1] 2025-09-27 00:49:34.395332 | orchestrator | changed: [testbed-node-2] 2025-09-27 00:49:34.395340 | orchestrator | 2025-09-27 00:49:34.395348 | orchestrator | TASK [include_role : letsencrypt] ********************************************** 2025-09-27 00:49:34.395356 | orchestrator | Saturday 27 September 2025 00:46:51 +0000 (0:00:02.038) 0:02:44.396 **** 2025-09-27 00:49:34.395363 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:49:34.395371 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:49:34.395379 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:49:34.395387 | orchestrator | 2025-09-27 00:49:34.395394 | orchestrator | TASK [include_role : magnum] *************************************************** 2025-09-27 00:49:34.395402 | orchestrator | Saturday 27 September 2025 00:46:51 +0000 (0:00:00.418) 0:02:44.815 **** 2025-09-27 00:49:34.395410 | orchestrator | included: magnum for testbed-node-0, testbed-node-1, testbed-node-2 2025-09-27 00:49:34.395418 | orchestrator | 2025-09-27 00:49:34.395426 | orchestrator | TASK [haproxy-config : Copying over magnum haproxy config] ********************* 2025-09-27 00:49:34.395434 | orchestrator | Saturday 27 September 2025 00:46:52 +0000 (0:00:00.896) 0:02:45.712 **** 2025-09-27 00:49:34.395483 | orchestrator | changed: [testbed-node-0] => (item={'key': 'magnum-api', 'value': {'container_name': 'magnum_api', 'group': 'magnum-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/magnum-api:2024.2', 'environment': {'DUMMY_ENVIRONMENT': 'kolla_useless_env'}, 'volumes': ['/etc/kolla/magnum-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:9511'], 'timeout': '30'}, 'haproxy': {'magnum_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9511', 'listen_port': '9511'}, 'magnum_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9511', 'listen_port': '9511'}}}}) 2025-09-27 00:49:34.395498 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'magnum-conductor', 'value': {'container_name': 'magnum_conductor', 'group': 'magnum-conductor', 'enabled': True, 'image': 'registry.osism.tech/kolla/magnum-conductor:2024.2', 'environment': {'http_proxy': '', 'https_proxy': '', 'no_proxy': 'localhost,127.0.0.1,192.168.16.10,192.168.16.9'}, 'volumes': ['/etc/kolla/magnum-conductor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 
'magnum:/var/lib/magnum/', '', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port magnum-conductor 5672'], 'timeout': '30'}}})  2025-09-27 00:49:34.395507 | orchestrator | changed: [testbed-node-2] => (item={'key': 'magnum-api', 'value': {'container_name': 'magnum_api', 'group': 'magnum-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/magnum-api:2024.2', 'environment': {'DUMMY_ENVIRONMENT': 'kolla_useless_env'}, 'volumes': ['/etc/kolla/magnum-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:9511'], 'timeout': '30'}, 'haproxy': {'magnum_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9511', 'listen_port': '9511'}, 'magnum_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9511', 'listen_port': '9511'}}}}) 2025-09-27 00:49:34.395516 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'magnum-conductor', 'value': {'container_name': 'magnum_conductor', 'group': 'magnum-conductor', 'enabled': True, 'image': 'registry.osism.tech/kolla/magnum-conductor:2024.2', 'environment': {'http_proxy': '', 'https_proxy': '', 'no_proxy': 'localhost,127.0.0.1,192.168.16.12,192.168.16.9'}, 'volumes': ['/etc/kolla/magnum-conductor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'magnum:/var/lib/magnum/', '', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port magnum-conductor 5672'], 'timeout': '30'}}})  2025-09-27 00:49:34.395524 | orchestrator | changed: [testbed-node-1] => (item={'key': 'magnum-api', 'value': {'container_name': 'magnum_api', 'group': 'magnum-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/magnum-api:2024.2', 'environment': {'DUMMY_ENVIRONMENT': 'kolla_useless_env'}, 'volumes': ['/etc/kolla/magnum-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:9511'], 'timeout': '30'}, 'haproxy': {'magnum_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9511', 'listen_port': '9511'}, 'magnum_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9511', 'listen_port': '9511'}}}}) 2025-09-27 00:49:34.395541 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'magnum-conductor', 'value': {'container_name': 'magnum_conductor', 'group': 'magnum-conductor', 'enabled': True, 'image': 'registry.osism.tech/kolla/magnum-conductor:2024.2', 'environment': {'http_proxy': '', 'https_proxy': '', 'no_proxy': 'localhost,127.0.0.1,192.168.16.11,192.168.16.9'}, 'volumes': ['/etc/kolla/magnum-conductor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'magnum:/var/lib/magnum/', '', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': 
'5', 'test': ['CMD-SHELL', 'healthcheck_port magnum-conductor 5672'], 'timeout': '30'}}})  2025-09-27 00:49:34.395556 | orchestrator | 2025-09-27 00:49:34.395564 | orchestrator | TASK [haproxy-config : Add configuration for magnum when using single external frontend] *** 2025-09-27 00:49:34.395571 | orchestrator | Saturday 27 September 2025 00:46:56 +0000 (0:00:03.852) 0:02:49.564 **** 2025-09-27 00:49:34.395580 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'magnum-api', 'value': {'container_name': 'magnum_api', 'group': 'magnum-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/magnum-api:2024.2', 'environment': {'DUMMY_ENVIRONMENT': 'kolla_useless_env'}, 'volumes': ['/etc/kolla/magnum-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:9511'], 'timeout': '30'}, 'haproxy': {'magnum_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9511', 'listen_port': '9511'}, 'magnum_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9511', 'listen_port': '9511'}}}})  2025-09-27 00:49:34.395588 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'magnum-api', 'value': {'container_name': 'magnum_api', 'group': 'magnum-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/magnum-api:2024.2', 'environment': {'DUMMY_ENVIRONMENT': 'kolla_useless_env'}, 'volumes': ['/etc/kolla/magnum-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:9511'], 'timeout': '30'}, 'haproxy': {'magnum_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9511', 'listen_port': '9511'}, 'magnum_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9511', 'listen_port': '9511'}}}})  2025-09-27 00:49:34.395596 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'magnum-conductor', 'value': {'container_name': 'magnum_conductor', 'group': 'magnum-conductor', 'enabled': True, 'image': 'registry.osism.tech/kolla/magnum-conductor:2024.2', 'environment': {'http_proxy': '', 'https_proxy': '', 'no_proxy': 'localhost,127.0.0.1,192.168.16.12,192.168.16.9'}, 'volumes': ['/etc/kolla/magnum-conductor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'magnum:/var/lib/magnum/', '', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port magnum-conductor 5672'], 'timeout': '30'}}})  2025-09-27 00:49:34.395605 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:49:34.395626 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'magnum-conductor', 'value': {'container_name': 'magnum_conductor', 'group': 'magnum-conductor', 'enabled': True, 'image': 'registry.osism.tech/kolla/magnum-conductor:2024.2', 'environment': {'http_proxy': '', 'https_proxy': '', 'no_proxy': 'localhost,127.0.0.1,192.168.16.10,192.168.16.9'}, 'volumes': ['/etc/kolla/magnum-conductor/:/var/lib/kolla/config_files/:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'magnum:/var/lib/magnum/', '', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port magnum-conductor 5672'], 'timeout': '30'}}})  2025-09-27 00:49:34.395635 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:49:34.395648 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'magnum-api', 'value': {'container_name': 'magnum_api', 'group': 'magnum-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/magnum-api:2024.2', 'environment': {'DUMMY_ENVIRONMENT': 'kolla_useless_env'}, 'volumes': ['/etc/kolla/magnum-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:9511'], 'timeout': '30'}, 'haproxy': {'magnum_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9511', 'listen_port': '9511'}, 'magnum_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9511', 'listen_port': '9511'}}}})  2025-09-27 00:49:34.395664 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'magnum-conductor', 'value': {'container_name': 'magnum_conductor', 'group': 'magnum-conductor', 'enabled': True, 'image': 'registry.osism.tech/kolla/magnum-conductor:2024.2', 'environment': {'http_proxy': '', 'https_proxy': '', 'no_proxy': 'localhost,127.0.0.1,192.168.16.11,192.168.16.9'}, 'volumes': ['/etc/kolla/magnum-conductor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'magnum:/var/lib/magnum/', '', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port magnum-conductor 5672'], 'timeout': '30'}}})  2025-09-27 00:49:34.395672 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:49:34.395680 | orchestrator | 2025-09-27 00:49:34.395688 | orchestrator | TASK [haproxy-config : Configuring firewall for magnum] ************************ 2025-09-27 00:49:34.395696 | orchestrator | Saturday 27 September 2025 00:46:57 +0000 (0:00:01.249) 0:02:50.814 **** 2025-09-27 00:49:34.395704 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'magnum_api', 'value': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9511', 'listen_port': '9511'}})  2025-09-27 00:49:34.395712 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'magnum_api_external', 'value': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9511', 'listen_port': '9511'}})  2025-09-27 00:49:34.395720 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:49:34.395728 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'magnum_api', 'value': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9511', 'listen_port': '9511'}})  2025-09-27 00:49:34.395736 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'magnum_api_external', 'value': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9511', 'listen_port': '9511'}})  2025-09-27 00:49:34.395744 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:49:34.395752 | orchestrator 
| skipping: [testbed-node-2] => (item={'key': 'magnum_api', 'value': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9511', 'listen_port': '9511'}})  2025-09-27 00:49:34.395759 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'magnum_api_external', 'value': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9511', 'listen_port': '9511'}})  2025-09-27 00:49:34.395766 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:49:34.395773 | orchestrator | 2025-09-27 00:49:34.395779 | orchestrator | TASK [proxysql-config : Copying over magnum ProxySQL users config] ************* 2025-09-27 00:49:34.395786 | orchestrator | Saturday 27 September 2025 00:46:59 +0000 (0:00:01.641) 0:02:52.456 **** 2025-09-27 00:49:34.395792 | orchestrator | changed: [testbed-node-0] 2025-09-27 00:49:34.395799 | orchestrator | changed: [testbed-node-2] 2025-09-27 00:49:34.395805 | orchestrator | changed: [testbed-node-1] 2025-09-27 00:49:34.395812 | orchestrator | 2025-09-27 00:49:34.395818 | orchestrator | TASK [proxysql-config : Copying over magnum ProxySQL rules config] ************* 2025-09-27 00:49:34.395825 | orchestrator | Saturday 27 September 2025 00:47:00 +0000 (0:00:01.797) 0:02:54.254 **** 2025-09-27 00:49:34.395832 | orchestrator | changed: [testbed-node-1] 2025-09-27 00:49:34.395838 | orchestrator | changed: [testbed-node-0] 2025-09-27 00:49:34.395849 | orchestrator | changed: [testbed-node-2] 2025-09-27 00:49:34.395856 | orchestrator | 2025-09-27 00:49:34.395863 | orchestrator | TASK [include_role : manila] *************************************************** 2025-09-27 00:49:34.395869 | orchestrator | Saturday 27 September 2025 00:47:02 +0000 (0:00:02.033) 0:02:56.288 **** 2025-09-27 00:49:34.395886 | orchestrator | included: manila for testbed-node-0, testbed-node-1, testbed-node-2 2025-09-27 00:49:34.395893 | orchestrator | 2025-09-27 00:49:34.395900 | orchestrator | TASK [haproxy-config : Copying over manila haproxy config] ********************* 2025-09-27 00:49:34.395906 | orchestrator | Saturday 27 September 2025 00:47:04 +0000 (0:00:01.192) 0:02:57.480 **** 2025-09-27 00:49:34.395917 | orchestrator | changed: [testbed-node-0] => (item={'key': 'manila-api', 'value': {'container_name': 'manila_api', 'group': 'manila-api', 'image': 'registry.osism.tech/kolla/manila-api:2024.2', 'enabled': True, 'volumes': ['/etc/kolla/manila-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:8786'], 'timeout': '30'}, 'haproxy': {'manila_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '8786', 'listen_port': '8786'}, 'manila_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8786', 'listen_port': '8786'}}}}) 2025-09-27 00:49:34.395924 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'manila-scheduler', 'value': {'container_name': 'manila_scheduler', 'group': 'manila-scheduler', 'image': 'registry.osism.tech/kolla/manila-scheduler:2024.2', 'enabled': True, 'volumes': ['/etc/kolla/manila-scheduler/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 
'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port manila-scheduler 5672'], 'timeout': '30'}}})  2025-09-27 00:49:34.395931 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'manila-share', 'value': {'container_name': 'manila_share', 'group': 'manila-share', 'image': 'registry.osism.tech/kolla/manila-share:2024.2', 'enabled': True, 'privileged': True, 'volumes': ['/etc/kolla/manila-share/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/run:/run:shared', 'kolla_logs:/var/log/kolla/', '', '/lib/modules:/lib/modules:ro', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port manila-share 5672'], 'timeout': '30'}}})  2025-09-27 00:49:34.395939 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'manila-data', 'value': {'container_name': 'manila_data', 'group': 'manila-data', 'image': 'registry.osism.tech/kolla/manila-data:2024.2', 'enabled': True, 'privileged': True, 'volumes': ['/etc/kolla/manila-data/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run:/run:shared', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port manila-data 5672'], 'timeout': '30'}}})  2025-09-27 00:49:34.395946 | orchestrator | changed: [testbed-node-1] => (item={'key': 'manila-api', 'value': {'container_name': 'manila_api', 'group': 'manila-api', 'image': 'registry.osism.tech/kolla/manila-api:2024.2', 'enabled': True, 'volumes': ['/etc/kolla/manila-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:8786'], 'timeout': '30'}, 'haproxy': {'manila_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '8786', 'listen_port': '8786'}, 'manila_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8786', 'listen_port': '8786'}}}}) 2025-09-27 00:49:34.395969 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'manila-scheduler', 'value': {'container_name': 'manila_scheduler', 'group': 'manila-scheduler', 'image': 'registry.osism.tech/kolla/manila-scheduler:2024.2', 'enabled': True, 'volumes': ['/etc/kolla/manila-scheduler/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port manila-scheduler 5672'], 'timeout': '30'}}})  2025-09-27 00:49:34.395980 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'manila-share', 'value': {'container_name': 'manila_share', 'group': 'manila-share', 'image': 'registry.osism.tech/kolla/manila-share:2024.2', 'enabled': True, 'privileged': True, 'volumes': ['/etc/kolla/manila-share/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/run:/run:shared', 'kolla_logs:/var/log/kolla/', '', '/lib/modules:/lib/modules:ro', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 
'healthcheck_port manila-share 5672'], 'timeout': '30'}}})  2025-09-27 00:49:34.395987 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'manila-data', 'value': {'container_name': 'manila_data', 'group': 'manila-data', 'image': 'registry.osism.tech/kolla/manila-data:2024.2', 'enabled': True, 'privileged': True, 'volumes': ['/etc/kolla/manila-data/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run:/run:shared', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port manila-data 5672'], 'timeout': '30'}}})  2025-09-27 00:49:34.395994 | orchestrator | changed: [testbed-node-2] => (item={'key': 'manila-api', 'value': {'container_name': 'manila_api', 'group': 'manila-api', 'image': 'registry.osism.tech/kolla/manila-api:2024.2', 'enabled': True, 'volumes': ['/etc/kolla/manila-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:8786'], 'timeout': '30'}, 'haproxy': {'manila_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '8786', 'listen_port': '8786'}, 'manila_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8786', 'listen_port': '8786'}}}}) 2025-09-27 00:49:34.396001 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'manila-scheduler', 'value': {'container_name': 'manila_scheduler', 'group': 'manila-scheduler', 'image': 'registry.osism.tech/kolla/manila-scheduler:2024.2', 'enabled': True, 'volumes': ['/etc/kolla/manila-scheduler/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port manila-scheduler 5672'], 'timeout': '30'}}})  2025-09-27 00:49:34.396008 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'manila-share', 'value': {'container_name': 'manila_share', 'group': 'manila-share', 'image': 'registry.osism.tech/kolla/manila-share:2024.2', 'enabled': True, 'privileged': True, 'volumes': ['/etc/kolla/manila-share/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/run:/run:shared', 'kolla_logs:/var/log/kolla/', '', '/lib/modules:/lib/modules:ro', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port manila-share 5672'], 'timeout': '30'}}})  2025-09-27 00:49:34.396024 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'manila-data', 'value': {'container_name': 'manila_data', 'group': 'manila-data', 'image': 'registry.osism.tech/kolla/manila-data:2024.2', 'enabled': True, 'privileged': True, 'volumes': ['/etc/kolla/manila-data/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run:/run:shared', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port manila-data 5672'], 'timeout': '30'}}})  2025-09-27 00:49:34.396031 | 
orchestrator | 2025-09-27 00:49:34.396038 | orchestrator | TASK [haproxy-config : Add configuration for manila when using single external frontend] *** 2025-09-27 00:49:34.396045 | orchestrator | Saturday 27 September 2025 00:47:08 +0000 (0:00:04.563) 0:03:02.044 **** 2025-09-27 00:49:34.396054 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'manila-api', 'value': {'container_name': 'manila_api', 'group': 'manila-api', 'image': 'registry.osism.tech/kolla/manila-api:2024.2', 'enabled': True, 'volumes': ['/etc/kolla/manila-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:8786'], 'timeout': '30'}, 'haproxy': {'manila_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '8786', 'listen_port': '8786'}, 'manila_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8786', 'listen_port': '8786'}}}})  2025-09-27 00:49:34.396062 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'manila-scheduler', 'value': {'container_name': 'manila_scheduler', 'group': 'manila-scheduler', 'image': 'registry.osism.tech/kolla/manila-scheduler:2024.2', 'enabled': True, 'volumes': ['/etc/kolla/manila-scheduler/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port manila-scheduler 5672'], 'timeout': '30'}}})  2025-09-27 00:49:34.396069 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'manila-share', 'value': {'container_name': 'manila_share', 'group': 'manila-share', 'image': 'registry.osism.tech/kolla/manila-share:2024.2', 'enabled': True, 'privileged': True, 'volumes': ['/etc/kolla/manila-share/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/run:/run:shared', 'kolla_logs:/var/log/kolla/', '', '/lib/modules:/lib/modules:ro', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port manila-share 5672'], 'timeout': '30'}}})  2025-09-27 00:49:34.396076 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'manila-data', 'value': {'container_name': 'manila_data', 'group': 'manila-data', 'image': 'registry.osism.tech/kolla/manila-data:2024.2', 'enabled': True, 'privileged': True, 'volumes': ['/etc/kolla/manila-data/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run:/run:shared', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port manila-data 5672'], 'timeout': '30'}}})  2025-09-27 00:49:34.396082 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:49:34.396090 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'manila-api', 'value': {'container_name': 'manila_api', 'group': 'manila-api', 'image': 'registry.osism.tech/kolla/manila-api:2024.2', 'enabled': True, 'volumes': ['/etc/kolla/manila-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 
'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:8786'], 'timeout': '30'}, 'haproxy': {'manila_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '8786', 'listen_port': '8786'}, 'manila_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8786', 'listen_port': '8786'}}}})  2025-09-27 00:49:34.396114 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'manila-scheduler', 'value': {'container_name': 'manila_scheduler', 'group': 'manila-scheduler', 'image': 'registry.osism.tech/kolla/manila-scheduler:2024.2', 'enabled': True, 'volumes': ['/etc/kolla/manila-scheduler/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port manila-scheduler 5672'], 'timeout': '30'}}})  2025-09-27 00:49:34.396125 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'manila-share', 'value': {'container_name': 'manila_share', 'group': 'manila-share', 'image': 'registry.osism.tech/kolla/manila-share:2024.2', 'enabled': True, 'privileged': True, 'volumes': ['/etc/kolla/manila-share/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/run:/run:shared', 'kolla_logs:/var/log/kolla/', '', '/lib/modules:/lib/modules:ro', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port manila-share 5672'], 'timeout': '30'}}})  2025-09-27 00:49:34.396132 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'manila-data', 'value': {'container_name': 'manila_data', 'group': 'manila-data', 'image': 'registry.osism.tech/kolla/manila-data:2024.2', 'enabled': True, 'privileged': True, 'volumes': ['/etc/kolla/manila-data/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run:/run:shared', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port manila-data 5672'], 'timeout': '30'}}})  2025-09-27 00:49:34.396139 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:49:34.396146 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'manila-api', 'value': {'container_name': 'manila_api', 'group': 'manila-api', 'image': 'registry.osism.tech/kolla/manila-api:2024.2', 'enabled': True, 'volumes': ['/etc/kolla/manila-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:8786'], 'timeout': '30'}, 'haproxy': {'manila_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '8786', 'listen_port': '8786'}, 'manila_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8786', 'listen_port': '8786'}}}})  2025-09-27 00:49:34.396153 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'manila-scheduler', 'value': {'container_name': 'manila_scheduler', 'group': 
'manila-scheduler', 'image': 'registry.osism.tech/kolla/manila-scheduler:2024.2', 'enabled': True, 'volumes': ['/etc/kolla/manila-scheduler/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port manila-scheduler 5672'], 'timeout': '30'}}})  2025-09-27 00:49:34.396165 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'manila-share', 'value': {'container_name': 'manila_share', 'group': 'manila-share', 'image': 'registry.osism.tech/kolla/manila-share:2024.2', 'enabled': True, 'privileged': True, 'volumes': ['/etc/kolla/manila-share/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/run:/run:shared', 'kolla_logs:/var/log/kolla/', '', '/lib/modules:/lib/modules:ro', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port manila-share 5672'], 'timeout': '30'}}})  2025-09-27 00:49:34.396175 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'manila-data', 'value': {'container_name': 'manila_data', 'group': 'manila-data', 'image': 'registry.osism.tech/kolla/manila-data:2024.2', 'enabled': True, 'privileged': True, 'volumes': ['/etc/kolla/manila-data/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/dev/:/dev/', '/run:/run:shared', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port manila-data 5672'], 'timeout': '30'}}})  2025-09-27 00:49:34.396182 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:49:34.396189 | orchestrator | 2025-09-27 00:49:34.396196 | orchestrator | TASK [haproxy-config : Configuring firewall for manila] ************************ 2025-09-27 00:49:34.396239 | orchestrator | Saturday 27 September 2025 00:47:09 +0000 (0:00:00.876) 0:03:02.920 **** 2025-09-27 00:49:34.396246 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'manila_api', 'value': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '8786', 'listen_port': '8786'}})  2025-09-27 00:49:34.396256 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'manila_api_external', 'value': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8786', 'listen_port': '8786'}})  2025-09-27 00:49:34.396263 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:49:34.396270 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'manila_api', 'value': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '8786', 'listen_port': '8786'}})  2025-09-27 00:49:34.396277 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'manila_api_external', 'value': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8786', 'listen_port': '8786'}})  2025-09-27 00:49:34.396284 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:49:34.396291 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'manila_api', 'value': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '8786', 'listen_port': '8786'}})  2025-09-27 00:49:34.396297 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'manila_api_external', 'value': {'enabled': 'yes', 
'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8786', 'listen_port': '8786'}})  2025-09-27 00:49:34.396304 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:49:34.396311 | orchestrator | 2025-09-27 00:49:34.396317 | orchestrator | TASK [proxysql-config : Copying over manila ProxySQL users config] ************* 2025-09-27 00:49:34.396324 | orchestrator | Saturday 27 September 2025 00:47:10 +0000 (0:00:01.217) 0:03:04.138 **** 2025-09-27 00:49:34.396331 | orchestrator | changed: [testbed-node-0] 2025-09-27 00:49:34.396338 | orchestrator | changed: [testbed-node-1] 2025-09-27 00:49:34.396344 | orchestrator | changed: [testbed-node-2] 2025-09-27 00:49:34.396351 | orchestrator | 2025-09-27 00:49:34.396357 | orchestrator | TASK [proxysql-config : Copying over manila ProxySQL rules config] ************* 2025-09-27 00:49:34.396364 | orchestrator | Saturday 27 September 2025 00:47:12 +0000 (0:00:01.261) 0:03:05.400 **** 2025-09-27 00:49:34.396375 | orchestrator | changed: [testbed-node-0] 2025-09-27 00:49:34.396382 | orchestrator | changed: [testbed-node-1] 2025-09-27 00:49:34.396389 | orchestrator | changed: [testbed-node-2] 2025-09-27 00:49:34.396395 | orchestrator | 2025-09-27 00:49:34.396402 | orchestrator | TASK [include_role : mariadb] ************************************************** 2025-09-27 00:49:34.396409 | orchestrator | Saturday 27 September 2025 00:47:13 +0000 (0:00:01.902) 0:03:07.302 **** 2025-09-27 00:49:34.396416 | orchestrator | included: mariadb for testbed-node-0, testbed-node-1, testbed-node-2 2025-09-27 00:49:34.396422 | orchestrator | 2025-09-27 00:49:34.396429 | orchestrator | TASK [mariadb : Ensure mysql monitor user exist] ******************************* 2025-09-27 00:49:34.396436 | orchestrator | Saturday 27 September 2025 00:47:15 +0000 (0:00:01.280) 0:03:08.583 **** 2025-09-27 00:49:34.396442 | orchestrator | ok: [testbed-node-0] => (item=testbed-node-0) 2025-09-27 00:49:34.396449 | orchestrator | 2025-09-27 00:49:34.396456 | orchestrator | TASK [haproxy-config : Copying over mariadb haproxy config] ******************** 2025-09-27 00:49:34.396462 | orchestrator | Saturday 27 September 2025 00:47:16 +0000 (0:00:01.467) 0:03:10.051 **** 2025-09-27 00:49:34.396482 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'mariadb', 'value': {'container_name': 'mariadb', 'group': 'mariadb_shard_0', 'enabled': True, 'image': 'registry.osism.tech/kolla/mariadb-server:2024.2', 'volumes': ['/etc/kolla/mariadb/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/hosts:/etc/hosts:ro', '/etc/timezone:/etc/timezone:ro', 'mariadb:/var/lib/mysql', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/clustercheck'], 'timeout': '30'}, 'environment': {'MYSQL_USERNAME': 'monitor', 'MYSQL_PASSWORD': 'iek7ooth9miesodoh2ongohcaachah0I', 'MYSQL_HOST': '192.168.16.10', 'AVAILABLE_WHEN_DONOR': '1'}, 'haproxy': {'mariadb': {'enabled': True, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s', ''], 'custom_member_list': [' server testbed-node-0 192.168.16.10:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 192.168.16.11:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 192.168.16.12:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}, 
'mariadb_external_lb': {'enabled': False, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'custom_member_list': [' server testbed-node-0 testbed-node-0:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 testbed-node-1:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 testbed-node-2:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}}}})  2025-09-27 00:49:34.396495 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'mariadb-clustercheck', 'value': {'container_name': 'mariadb_clustercheck', 'group': 'mariadb_shard_0', 'enabled': False, 'image': 'registry.osism.tech/kolla/mariadb-clustercheck:2024.2', 'volumes': ['/etc/kolla/mariadb-clustercheck/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'environment': {'MYSQL_USERNAME': 'monitor', 'MYSQL_PASSWORD': 'iek7ooth9miesodoh2ongohcaachah0I', 'MYSQL_HOST': '192.168.16.10', 'AVAILABLE_WHEN_DONOR': '1'}}})  2025-09-27 00:49:34.396502 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:49:34.396509 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'mariadb', 'value': {'container_name': 'mariadb', 'group': 'mariadb_shard_0', 'enabled': True, 'image': 'registry.osism.tech/kolla/mariadb-server:2024.2', 'volumes': ['/etc/kolla/mariadb/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/hosts:/etc/hosts:ro', '/etc/timezone:/etc/timezone:ro', 'mariadb:/var/lib/mysql', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/clustercheck'], 'timeout': '30'}, 'environment': {'MYSQL_USERNAME': 'monitor', 'MYSQL_PASSWORD': 'iek7ooth9miesodoh2ongohcaachah0I', 'MYSQL_HOST': '192.168.16.11', 'AVAILABLE_WHEN_DONOR': '1'}, 'haproxy': {'mariadb': {'enabled': True, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s', ''], 'custom_member_list': [' server testbed-node-0 192.168.16.10:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 192.168.16.11:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 192.168.16.12:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}, 'mariadb_external_lb': {'enabled': False, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'custom_member_list': [' server testbed-node-0 testbed-node-0:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 testbed-node-1:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 testbed-node-2:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}}}})  2025-09-27 00:49:34.396521 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'mariadb-clustercheck', 'value': {'container_name': 'mariadb_clustercheck', 'group': 'mariadb_shard_0', 'enabled': False, 'image': 'registry.osism.tech/kolla/mariadb-clustercheck:2024.2', 'volumes': ['/etc/kolla/mariadb-clustercheck/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 
'dimensions': {}, 'environment': {'MYSQL_USERNAME': 'monitor', 'MYSQL_PASSWORD': 'iek7ooth9miesodoh2ongohcaachah0I', 'MYSQL_HOST': '192.168.16.11', 'AVAILABLE_WHEN_DONOR': '1'}}})  2025-09-27 00:49:34.396528 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:49:34.396544 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'mariadb', 'value': {'container_name': 'mariadb', 'group': 'mariadb_shard_0', 'enabled': True, 'image': 'registry.osism.tech/kolla/mariadb-server:2024.2', 'volumes': ['/etc/kolla/mariadb/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/hosts:/etc/hosts:ro', '/etc/timezone:/etc/timezone:ro', 'mariadb:/var/lib/mysql', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/clustercheck'], 'timeout': '30'}, 'environment': {'MYSQL_USERNAME': 'monitor', 'MYSQL_PASSWORD': 'iek7ooth9miesodoh2ongohcaachah0I', 'MYSQL_HOST': '192.168.16.12', 'AVAILABLE_WHEN_DONOR': '1'}, 'haproxy': {'mariadb': {'enabled': True, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s', ''], 'custom_member_list': [' server testbed-node-0 192.168.16.10:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 192.168.16.11:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 192.168.16.12:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}, 'mariadb_external_lb': {'enabled': False, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'custom_member_list': [' server testbed-node-0 testbed-node-0:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 testbed-node-1:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 testbed-node-2:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}}}})  2025-09-27 00:49:34.396553 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'mariadb-clustercheck', 'value': {'container_name': 'mariadb_clustercheck', 'group': 'mariadb_shard_0', 'enabled': False, 'image': 'registry.osism.tech/kolla/mariadb-clustercheck:2024.2', 'volumes': ['/etc/kolla/mariadb-clustercheck/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'environment': {'MYSQL_USERNAME': 'monitor', 'MYSQL_PASSWORD': 'iek7ooth9miesodoh2ongohcaachah0I', 'MYSQL_HOST': '192.168.16.12', 'AVAILABLE_WHEN_DONOR': '1'}}})  2025-09-27 00:49:34.396566 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:49:34.396573 | orchestrator | 2025-09-27 00:49:34.396580 | orchestrator | TASK [haproxy-config : Add configuration for mariadb when using single external frontend] *** 2025-09-27 00:49:34.396587 | orchestrator | Saturday 27 September 2025 00:47:19 +0000 (0:00:02.765) 0:03:12.817 **** 2025-09-27 00:49:34.396594 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'mariadb', 'value': {'container_name': 'mariadb', 'group': 'mariadb_shard_0', 'enabled': True, 'image': 'registry.osism.tech/kolla/mariadb-server:2024.2', 'volumes': ['/etc/kolla/mariadb/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/hosts:/etc/hosts:ro', '/etc/timezone:/etc/timezone:ro', 'mariadb:/var/lib/mysql', 
'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/clustercheck'], 'timeout': '30'}, 'environment': {'MYSQL_USERNAME': 'monitor', 'MYSQL_PASSWORD': 'iek7ooth9miesodoh2ongohcaachah0I', 'MYSQL_HOST': '192.168.16.10', 'AVAILABLE_WHEN_DONOR': '1'}, 'haproxy': {'mariadb': {'enabled': True, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s', ''], 'custom_member_list': [' server testbed-node-0 192.168.16.10:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 192.168.16.11:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 192.168.16.12:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}, 'mariadb_external_lb': {'enabled': False, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'custom_member_list': [' server testbed-node-0 testbed-node-0:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 testbed-node-1:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 testbed-node-2:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}}}})  2025-09-27 00:49:34.396614 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'mariadb-clustercheck', 'value': {'container_name': 'mariadb_clustercheck', 'group': 'mariadb_shard_0', 'enabled': False, 'image': 'registry.osism.tech/kolla/mariadb-clustercheck:2024.2', 'volumes': ['/etc/kolla/mariadb-clustercheck/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'environment': {'MYSQL_USERNAME': 'monitor', 'MYSQL_PASSWORD': 'iek7ooth9miesodoh2ongohcaachah0I', 'MYSQL_HOST': '192.168.16.10', 'AVAILABLE_WHEN_DONOR': '1'}}})  2025-09-27 00:49:34.396621 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:49:34.396632 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'mariadb', 'value': {'container_name': 'mariadb', 'group': 'mariadb_shard_0', 'enabled': True, 'image': 'registry.osism.tech/kolla/mariadb-server:2024.2', 'volumes': ['/etc/kolla/mariadb/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/hosts:/etc/hosts:ro', '/etc/timezone:/etc/timezone:ro', 'mariadb:/var/lib/mysql', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/clustercheck'], 'timeout': '30'}, 'environment': {'MYSQL_USERNAME': 'monitor', 'MYSQL_PASSWORD': 'iek7ooth9miesodoh2ongohcaachah0I', 'MYSQL_HOST': '192.168.16.11', 'AVAILABLE_WHEN_DONOR': '1'}, 'haproxy': {'mariadb': {'enabled': True, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s', ''], 'custom_member_list': [' server testbed-node-0 192.168.16.10:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 192.168.16.11:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 192.168.16.12:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}, 'mariadb_external_lb': {'enabled': False, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 
'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'custom_member_list': [' server testbed-node-0 testbed-node-0:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 testbed-node-1:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 testbed-node-2:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}}}})  2025-09-27 00:49:34.396646 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'mariadb-clustercheck', 'value': {'container_name': 'mariadb_clustercheck', 'group': 'mariadb_shard_0', 'enabled': False, 'image': 'registry.osism.tech/kolla/mariadb-clustercheck:2024.2', 'volumes': ['/etc/kolla/mariadb-clustercheck/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'environment': {'MYSQL_USERNAME': 'monitor', 'MYSQL_PASSWORD': 'iek7ooth9miesodoh2ongohcaachah0I', 'MYSQL_HOST': '192.168.16.11', 'AVAILABLE_WHEN_DONOR': '1'}}})  2025-09-27 00:49:34.396653 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:49:34.396675 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'mariadb', 'value': {'container_name': 'mariadb', 'group': 'mariadb_shard_0', 'enabled': True, 'image': 'registry.osism.tech/kolla/mariadb-server:2024.2', 'volumes': ['/etc/kolla/mariadb/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/hosts:/etc/hosts:ro', '/etc/timezone:/etc/timezone:ro', 'mariadb:/var/lib/mysql', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/clustercheck'], 'timeout': '30'}, 'environment': {'MYSQL_USERNAME': 'monitor', 'MYSQL_PASSWORD': 'iek7ooth9miesodoh2ongohcaachah0I', 'MYSQL_HOST': '192.168.16.12', 'AVAILABLE_WHEN_DONOR': '1'}, 'haproxy': {'mariadb': {'enabled': True, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s', ''], 'custom_member_list': [' server testbed-node-0 192.168.16.10:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 192.168.16.11:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 192.168.16.12:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}, 'mariadb_external_lb': {'enabled': False, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'custom_member_list': [' server testbed-node-0 testbed-node-0:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 testbed-node-1:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 testbed-node-2:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}}}})  2025-09-27 00:49:34.396684 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'mariadb-clustercheck', 'value': {'container_name': 'mariadb_clustercheck', 'group': 'mariadb_shard_0', 'enabled': False, 'image': 'registry.osism.tech/kolla/mariadb-clustercheck:2024.2', 'volumes': ['/etc/kolla/mariadb-clustercheck/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'environment': {'MYSQL_USERNAME': 'monitor', 'MYSQL_PASSWORD': 
'iek7ooth9miesodoh2ongohcaachah0I', 'MYSQL_HOST': '192.168.16.12', 'AVAILABLE_WHEN_DONOR': '1'}}})  2025-09-27 00:49:34.396696 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:49:34.396703 | orchestrator | 2025-09-27 00:49:34.396709 | orchestrator | TASK [haproxy-config : Configuring firewall for mariadb] *********************** 2025-09-27 00:49:34.396716 | orchestrator | Saturday 27 September 2025 00:47:22 +0000 (0:00:02.613) 0:03:15.430 **** 2025-09-27 00:49:34.396723 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'mariadb', 'value': {'enabled': True, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s', ''], 'custom_member_list': [' server testbed-node-0 192.168.16.10:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 192.168.16.11:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 192.168.16.12:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}})  2025-09-27 00:49:34.396730 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'mariadb_external_lb', 'value': {'enabled': False, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'custom_member_list': [' server testbed-node-0 testbed-node-0:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 testbed-node-1:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 testbed-node-2:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}})  2025-09-27 00:49:34.396737 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:49:34.396744 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'mariadb', 'value': {'enabled': True, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s', ''], 'custom_member_list': [' server testbed-node-0 192.168.16.10:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 192.168.16.11:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 192.168.16.12:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}})  2025-09-27 00:49:34.396751 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'mariadb_external_lb', 'value': {'enabled': False, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'custom_member_list': [' server testbed-node-0 testbed-node-0:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 testbed-node-1:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 testbed-node-2:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}})  2025-09-27 00:49:34.396758 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:49:34.396776 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'mariadb', 'value': {'enabled': True, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s', ''], 'custom_member_list': [' server testbed-node-0 192.168.16.10:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 
192.168.16.11:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 192.168.16.12:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}})  2025-09-27 00:49:34.396784 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'mariadb_external_lb', 'value': {'enabled': False, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'custom_member_list': [' server testbed-node-0 testbed-node-0:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 testbed-node-1:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 testbed-node-2:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}})  2025-09-27 00:49:34.396794 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:49:34.396801 | orchestrator | 2025-09-27 00:49:34.396808 | orchestrator | TASK [proxysql-config : Copying over mariadb ProxySQL users config] ************ 2025-09-27 00:49:34.396814 | orchestrator | Saturday 27 September 2025 00:47:24 +0000 (0:00:02.273) 0:03:17.703 **** 2025-09-27 00:49:34.396821 | orchestrator | changed: [testbed-node-0] 2025-09-27 00:49:34.396828 | orchestrator | changed: [testbed-node-2] 2025-09-27 00:49:34.396834 | orchestrator | changed: [testbed-node-1] 2025-09-27 00:49:34.396841 | orchestrator | 2025-09-27 00:49:34.396847 | orchestrator | TASK [proxysql-config : Copying over mariadb ProxySQL rules config] ************ 2025-09-27 00:49:34.396854 | orchestrator | Saturday 27 September 2025 00:47:26 +0000 (0:00:01.662) 0:03:19.366 **** 2025-09-27 00:49:34.396861 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:49:34.396867 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:49:34.396874 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:49:34.396880 | orchestrator | 2025-09-27 00:49:34.396887 | orchestrator | TASK [include_role : masakari] ************************************************* 2025-09-27 00:49:34.396893 | orchestrator | Saturday 27 September 2025 00:47:27 +0000 (0:00:01.183) 0:03:20.549 **** 2025-09-27 00:49:34.396900 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:49:34.396906 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:49:34.396913 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:49:34.396919 | orchestrator | 2025-09-27 00:49:34.396926 | orchestrator | TASK [include_role : memcached] ************************************************ 2025-09-27 00:49:34.396933 | orchestrator | Saturday 27 September 2025 00:47:27 +0000 (0:00:00.280) 0:03:20.830 **** 2025-09-27 00:49:34.396939 | orchestrator | included: memcached for testbed-node-0, testbed-node-1, testbed-node-2 2025-09-27 00:49:34.396946 | orchestrator | 2025-09-27 00:49:34.396952 | orchestrator | TASK [haproxy-config : Copying over memcached haproxy config] ****************** 2025-09-27 00:49:34.396968 | orchestrator | Saturday 27 September 2025 00:47:28 +0000 (0:00:01.152) 0:03:21.982 **** 2025-09-27 00:49:34.396975 | orchestrator | changed: [testbed-node-0] => (item={'key': 'memcached', 'value': {'container_name': 'memcached', 'image': 'registry.osism.tech/kolla/memcached:2024.2', 'enabled': True, 'group': 'memcached', 'volumes': ['/etc/kolla/memcached/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 
'healthcheck_listen memcached 11211'], 'timeout': '30'}, 'haproxy': {'memcached': {'enabled': False, 'mode': 'tcp', 'port': '11211', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'active_passive': True}}}}) 2025-09-27 00:49:34.396983 | orchestrator | changed: [testbed-node-2] => (item={'key': 'memcached', 'value': {'container_name': 'memcached', 'image': 'registry.osism.tech/kolla/memcached:2024.2', 'enabled': True, 'group': 'memcached', 'volumes': ['/etc/kolla/memcached/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen memcached 11211'], 'timeout': '30'}, 'haproxy': {'memcached': {'enabled': False, 'mode': 'tcp', 'port': '11211', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'active_passive': True}}}}) 2025-09-27 00:49:34.397002 | orchestrator | changed: [testbed-node-1] => (item={'key': 'memcached', 'value': {'container_name': 'memcached', 'image': 'registry.osism.tech/kolla/memcached:2024.2', 'enabled': True, 'group': 'memcached', 'volumes': ['/etc/kolla/memcached/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen memcached 11211'], 'timeout': '30'}, 'haproxy': {'memcached': {'enabled': False, 'mode': 'tcp', 'port': '11211', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'active_passive': True}}}}) 2025-09-27 00:49:34.397014 | orchestrator | 2025-09-27 00:49:34.397024 | orchestrator | TASK [haproxy-config : Add configuration for memcached when using single external frontend] *** 2025-09-27 00:49:34.397031 | orchestrator | Saturday 27 September 2025 00:47:30 +0000 (0:00:01.432) 0:03:23.415 **** 2025-09-27 00:49:34.397038 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'memcached', 'value': {'container_name': 'memcached', 'image': 'registry.osism.tech/kolla/memcached:2024.2', 'enabled': True, 'group': 'memcached', 'volumes': ['/etc/kolla/memcached/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen memcached 11211'], 'timeout': '30'}, 'haproxy': {'memcached': {'enabled': False, 'mode': 'tcp', 'port': '11211', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'active_passive': True}}}})  2025-09-27 00:49:34.397045 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'memcached', 'value': {'container_name': 'memcached', 'image': 'registry.osism.tech/kolla/memcached:2024.2', 'enabled': True, 'group': 'memcached', 'volumes': ['/etc/kolla/memcached/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen memcached 11211'], 'timeout': '30'}, 'haproxy': {'memcached': {'enabled': False, 'mode': 'tcp', 'port': 
'11211', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'active_passive': True}}}})  2025-09-27 00:49:34.397052 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:49:34.397059 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:49:34.397065 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'memcached', 'value': {'container_name': 'memcached', 'image': 'registry.osism.tech/kolla/memcached:2024.2', 'enabled': True, 'group': 'memcached', 'volumes': ['/etc/kolla/memcached/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen memcached 11211'], 'timeout': '30'}, 'haproxy': {'memcached': {'enabled': False, 'mode': 'tcp', 'port': '11211', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'active_passive': True}}}})  2025-09-27 00:49:34.397072 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:49:34.397079 | orchestrator | 2025-09-27 00:49:34.397086 | orchestrator | TASK [haproxy-config : Configuring firewall for memcached] ********************* 2025-09-27 00:49:34.397092 | orchestrator | Saturday 27 September 2025 00:47:30 +0000 (0:00:00.338) 0:03:23.754 **** 2025-09-27 00:49:34.397099 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'memcached', 'value': {'enabled': False, 'mode': 'tcp', 'port': '11211', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'active_passive': True}})  2025-09-27 00:49:34.397106 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:49:34.397113 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'memcached', 'value': {'enabled': False, 'mode': 'tcp', 'port': '11211', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'active_passive': True}})  2025-09-27 00:49:34.397125 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:49:34.397143 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'memcached', 'value': {'enabled': False, 'mode': 'tcp', 'port': '11211', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'active_passive': True}})  2025-09-27 00:49:34.397150 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:49:34.397157 | orchestrator | 2025-09-27 00:49:34.397164 | orchestrator | TASK [proxysql-config : Copying over memcached ProxySQL users config] ********** 2025-09-27 00:49:34.397170 | orchestrator | Saturday 27 September 2025 00:47:31 +0000 (0:00:00.678) 0:03:24.433 **** 2025-09-27 00:49:34.397177 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:49:34.397183 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:49:34.397190 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:49:34.397196 | orchestrator | 2025-09-27 00:49:34.397213 | orchestrator | TASK [proxysql-config : Copying over memcached ProxySQL rules config] ********** 2025-09-27 00:49:34.397220 | orchestrator | Saturday 27 September 2025 00:47:31 +0000 (0:00:00.397) 0:03:24.831 **** 2025-09-27 00:49:34.397226 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:49:34.397236 | orchestrator | skipping: [testbed-node-1] 
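[editor's note] A pattern runs through the haproxy-config tasks above: an item is reported as "changed" only when its service dict carries a 'haproxy' mapping with at least one enabled frontend, and "skipping" otherwise (the manila-scheduler/share/data entries have no 'haproxy' key at all, and memcached's single frontend is defined with 'enabled': False). The following is a minimal, hypothetical Python sketch of that selection logic, not kolla-ansible's actual template code; the names project_services and needs_haproxy_config and the trimmed dict values are illustrative assumptions modeled on the log output above.

    # Illustrative approximation of why the loops above mark some items
    # "changed" and others "skipping"; not kolla-ansible's real implementation.

    # Hypothetical, trimmed-down service map modeled on the dicts in this log.
    project_services = {
        "manila-api": {
            "enabled": True,
            "haproxy": {
                "manila_api": {"enabled": "yes", "external": False, "port": "8786"},
                "manila_api_external": {"enabled": "yes", "external": True, "port": "8786"},
            },
        },
        "manila-scheduler": {"enabled": True},  # no 'haproxy' key -> skipped
        "memcached": {
            "enabled": True,
            "haproxy": {"memcached": {"enabled": False, "port": "11211"}},  # disabled -> skipped
        },
    }

    def needs_haproxy_config(service: dict) -> bool:
        """True when the service is enabled and exposes at least one enabled haproxy frontend."""
        if not service.get("enabled"):
            return False
        frontends = service.get("haproxy", {})
        # The log shows 'enabled' both as a boolean and as the string 'yes'; accept either form.
        return any(fe.get("enabled") in (True, "yes") for fe in frontends.values())

    for name, svc in project_services.items():
        state = "changed" if needs_haproxy_config(svc) else "skipping"
        print(f"{state}: {name}")

Run as-is, this prints "changed" for manila-api and "skipping" for manila-scheduler and memcached, mirroring the per-item results recorded in the tasks above. [end note]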
2025-09-27 00:49:34.397243 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:49:34.397250 | orchestrator | 2025-09-27 00:49:34.397256 | orchestrator | TASK [include_role : mistral] ************************************************** 2025-09-27 00:49:34.397263 | orchestrator | Saturday 27 September 2025 00:47:32 +0000 (0:00:01.090) 0:03:25.922 **** 2025-09-27 00:49:34.397269 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:49:34.397276 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:49:34.397283 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:49:34.397289 | orchestrator | 2025-09-27 00:49:34.397296 | orchestrator | TASK [include_role : neutron] ************************************************** 2025-09-27 00:49:34.397302 | orchestrator | Saturday 27 September 2025 00:47:32 +0000 (0:00:00.267) 0:03:26.189 **** 2025-09-27 00:49:34.397309 | orchestrator | included: neutron for testbed-node-0, testbed-node-1, testbed-node-2 2025-09-27 00:49:34.397315 | orchestrator | 2025-09-27 00:49:34.397322 | orchestrator | TASK [haproxy-config : Copying over neutron haproxy config] ******************** 2025-09-27 00:49:34.397328 | orchestrator | Saturday 27 September 2025 00:47:34 +0000 (0:00:01.218) 0:03:27.407 **** 2025-09-27 00:49:34.397335 | orchestrator | changed: [testbed-node-0] => (item={'key': 'neutron-server', 'value': {'container_name': 'neutron_server', 'image': 'registry.osism.tech/kolla/neutron-server:2024.2', 'enabled': True, 'group': 'neutron-server', 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:9696'], 'timeout': '30'}, 'haproxy': {'neutron_server': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9696', 'listen_port': '9696'}, 'neutron_server_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9696', 'listen_port': '9696'}}}}) 2025-09-27 00:49:34.397342 | orchestrator | changed: [testbed-node-1] => (item={'key': 'neutron-server', 'value': {'container_name': 'neutron_server', 'image': 'registry.osism.tech/kolla/neutron-server:2024.2', 'enabled': True, 'group': 'neutron-server', 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:9696'], 'timeout': '30'}, 'haproxy': {'neutron_server': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9696', 'listen_port': '9696'}, 'neutron_server_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9696', 'listen_port': '9696'}}}}) 2025-09-27 00:49:34.397354 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'neutron-openvswitch-agent', 'value': {'container_name': 'neutron_openvswitch_agent', 'image': 'registry.osism.tech/kolla/neutron-openvswitch-agent:2024.2', 'enabled': False, 'privileged': True, 'environment': {'KOLLA_LEGACY_IPTABLES': 'false'}, 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-openvswitch-agent/:/var/lib/kolla/config_files/:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-openvswitch-agent 5672'], 'timeout': '30'}}})  2025-09-27 00:49:34.397373 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'neutron-linuxbridge-agent', 'value': {'container_name': 'neutron_linuxbridge_agent', 'image': 'registry.osism.tech/kolla/neutron-linuxbridge-agent:2024.2', 'privileged': True, 'enabled': False, 'environment': {'KOLLA_LEGACY_IPTABLES': 'false'}, 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-linuxbridge-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-linuxbridge-agent 5672'], 'timeout': '30'}}})  2025-09-27 00:49:34.397384 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'neutron-openvswitch-agent', 'value': {'container_name': 'neutron_openvswitch_agent', 'image': 'registry.osism.tech/kolla/neutron-openvswitch-agent:2024.2', 'enabled': False, 'privileged': True, 'environment': {'KOLLA_LEGACY_IPTABLES': 'false'}, 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-openvswitch-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-openvswitch-agent 5672'], 'timeout': '30'}}})  2025-09-27 00:49:34.397391 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'neutron-dhcp-agent', 'value': {'container_name': 'neutron_dhcp_agent', 'image': 'registry.osism.tech/kolla/neutron-dhcp-agent:2024.2', 'privileged': True, 'enabled': False, 'group': 'neutron-dhcp-agent', 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-dhcp-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'neutron_metadata_socket:/var/lib/neutron/kolla/', '/run/netns:/run/netns:shared', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-dhcp-agent 5672'], 'timeout': '30'}}})  2025-09-27 00:49:34.397398 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'neutron-linuxbridge-agent', 'value': {'container_name': 'neutron_linuxbridge_agent', 'image': 'registry.osism.tech/kolla/neutron-linuxbridge-agent:2024.2', 'privileged': True, 'enabled': False, 'environment': {'KOLLA_LEGACY_IPTABLES': 'false'}, 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-linuxbridge-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-linuxbridge-agent 5672'], 'timeout': '30'}}})  2025-09-27 00:49:34.397409 | orchestrator | skipping: [testbed-node-0] => 
(item={'key': 'neutron-l3-agent', 'value': {'container_name': 'neutron_l3_agent', 'image': 'registry.osism.tech/kolla/neutron-l3-agent:2024.2', 'privileged': True, 'enabled': False, 'environment': {'KOLLA_LEGACY_IPTABLES': 'false'}, 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-l3-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'neutron_metadata_socket:/var/lib/neutron/kolla/', '/run/netns:/run/netns:shared', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', "healthcheck_port 'neutron-l3-agent ' 5672"], 'timeout': '30'}}})  2025-09-27 00:49:34.397420 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'neutron-dhcp-agent', 'value': {'container_name': 'neutron_dhcp_agent', 'image': 'registry.osism.tech/kolla/neutron-dhcp-agent:2024.2', 'privileged': True, 'enabled': False, 'group': 'neutron-dhcp-agent', 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-dhcp-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'neutron_metadata_socket:/var/lib/neutron/kolla/', '/run/netns:/run/netns:shared', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-dhcp-agent 5672'], 'timeout': '30'}}})  2025-09-27 00:49:34.397431 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'neutron-sriov-agent', 'value': {'container_name': 'neutron_sriov_agent', 'image': 'registry.osism.tech/kolla/neutron-sriov-agent:2024.2', 'privileged': True, 'enabled': False, 'host_in_groups': False, 'volumes': ['/etc/kolla/neutron-sriov-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-sriov-nic-agent 5672'], 'timeout': '30'}}})  2025-09-27 00:49:34.397438 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'neutron-l3-agent', 'value': {'container_name': 'neutron_l3_agent', 'image': 'registry.osism.tech/kolla/neutron-l3-agent:2024.2', 'privileged': True, 'enabled': False, 'environment': {'KOLLA_LEGACY_IPTABLES': 'false'}, 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-l3-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'neutron_metadata_socket:/var/lib/neutron/kolla/', '/run/netns:/run/netns:shared', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', "healthcheck_port 'neutron-l3-agent ' 5672"], 'timeout': '30'}}})  2025-09-27 00:49:34.397445 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'neutron-sriov-agent', 'value': {'container_name': 'neutron_sriov_agent', 'image': 'registry.osism.tech/kolla/neutron-sriov-agent:2024.2', 'privileged': True, 'enabled': False, 'host_in_groups': False, 'volumes': ['/etc/kolla/neutron-sriov-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': 
['CMD-SHELL', 'healthcheck_port neutron-sriov-nic-agent 5672'], 'timeout': '30'}}})  2025-09-27 00:49:34.397452 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'neutron-mlnx-agent', 'value': {'container_name': 'neutron_mlnx_agent', 'image': 'registry.osism.tech/kolla/neutron-mlnx-agent:2024.2', 'enabled': False, 'host_in_groups': False, 'volumes': ['/etc/kolla/neutron-mlnx-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}}})  2025-09-27 00:49:34.397466 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'neutron-mlnx-agent', 'value': {'container_name': 'neutron_mlnx_agent', 'image': 'registry.osism.tech/kolla/neutron-mlnx-agent:2024.2', 'enabled': False, 'host_in_groups': False, 'volumes': ['/etc/kolla/neutron-mlnx-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}}})  2025-09-27 00:49:34.397473 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'neutron-eswitchd', 'value': {'container_name': 'neutron_eswitchd', 'image': 'registry.osism.tech/kolla/neutron-eswitchd:2024.2', 'privileged': True, 'enabled': False, 'host_in_groups': False, 'volumes': ['/etc/kolla/neutron-eswitchd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/run/libvirt:/run/libvirt:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}}})  2025-09-27 00:49:34.397480 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'neutron-eswitchd', 'value': {'container_name': 'neutron_eswitchd', 'image': 'registry.osism.tech/kolla/neutron-eswitchd:2024.2', 'privileged': True, 'enabled': False, 'host_in_groups': False, 'volumes': ['/etc/kolla/neutron-eswitchd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/run/libvirt:/run/libvirt:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}}})  2025-09-27 00:49:34.397502 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'neutron-metadata-agent', 'value': {'container_name': 'neutron_metadata_agent', 'image': 'registry.osism.tech/kolla/neutron-metadata-agent:2024.2', 'privileged': True, 'enabled': False, 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-metadata-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'neutron_metadata_socket:/var/lib/neutron/kolla/', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-metadata-agent 5672'], 'timeout': '30'}}})  2025-09-27 00:49:34.397509 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'neutron-metadata-agent', 'value': {'container_name': 'neutron_metadata_agent', 'image': 'registry.osism.tech/kolla/neutron-metadata-agent:2024.2', 'privileged': True, 'enabled': False, 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-metadata-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'neutron_metadata_socket:/var/lib/neutron/kolla/', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-metadata-agent 5672'], 'timeout': '30'}}})  2025-09-27 00:49:34.397516 | 
orchestrator | skipping: [testbed-node-1] => (item={'key': 'neutron-ovn-metadata-agent', 'value': {'container_name': 'neutron_ovn_metadata_agent', 'image': 'registry.osism.tech/kolla/neutron-metadata-agent:2024.2', 'privileged': True, 'enabled': True, 'host_in_groups': False, 'volumes': ['/etc/kolla/neutron-ovn-metadata-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'neutron_metadata_socket:/var/lib/neutron/kolla/', '/run/openvswitch:/run/openvswitch:shared', '/run/netns:/run/netns:shared', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-ovn-metadata-agent 6640'], 'timeout': '30'}}})  2025-09-27 00:49:34.397523 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'neutron-ovn-metadata-agent', 'value': {'container_name': 'neutron_ovn_metadata_agent', 'image': 'registry.osism.tech/kolla/neutron-metadata-agent:2024.2', 'privileged': True, 'enabled': True, 'host_in_groups': False, 'volumes': ['/etc/kolla/neutron-ovn-metadata-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'neutron_metadata_socket:/var/lib/neutron/kolla/', '/run/openvswitch:/run/openvswitch:shared', '/run/netns:/run/netns:shared', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-ovn-metadata-agent 6640'], 'timeout': '30'}}})  2025-09-27 00:49:34.397535 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'neutron-bgp-dragent', 'value': {'container_name': 'neutron_bgp_dragent', 'image': 'registry.osism.tech/kolla/neutron-bgp-dragent:2024.2', 'privileged': True, 'enabled': False, 'group': 'neutron-bgp-dragent', 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-bgp-dragent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-bgp-dragent 5672'], 'timeout': '30'}}})  2025-09-27 00:49:34.397546 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'neutron-bgp-dragent', 'value': {'container_name': 'neutron_bgp_dragent', 'image': 'registry.osism.tech/kolla/neutron-bgp-dragent:2024.2', 'privileged': True, 'enabled': False, 'group': 'neutron-bgp-dragent', 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-bgp-dragent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-bgp-dragent 5672'], 'timeout': '30'}}})  2025-09-27 00:49:34.397556 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'neutron-infoblox-ipam-agent', 'value': {'container_name': 'neutron_infoblox_ipam_agent', 'image': 'registry.osism.tech/kolla/neutron-infoblox-ipam-agent:2024.2', 'privileged': True, 'enabled': False, 'group': 'neutron-infoblox-ipam-agent', 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-infoblox-ipam-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}}})  2025-09-27 00:49:34.397564 | orchestrator | skipping: [testbed-node-0] => (item={'key': 
'neutron-infoblox-ipam-agent', 'value': {'container_name': 'neutron_infoblox_ipam_agent', 'image': 'registry.osism.tech/kolla/neutron-infoblox-ipam-agent:2024.2', 'privileged': True, 'enabled': False, 'group': 'neutron-infoblox-ipam-agent', 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-infoblox-ipam-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}}})  2025-09-27 00:49:34.397571 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'neutron-metering-agent', 'value': {'container_name': 'neutron_metering_agent', 'image': 'registry.osism.tech/kolla/neutron-metering-agent:2024.2', 'privileged': True, 'enabled': False, 'group': 'neutron-metering-agent', 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-metering-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}}})  2025-09-27 00:49:34.397578 | orchestrator | changed: [testbed-node-2] => (item={'key': 'neutron-server', 'value': {'container_name': 'neutron_server', 'image': 'registry.osism.tech/kolla/neutron-server:2024.2', 'enabled': True, 'group': 'neutron-server', 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:9696'], 'timeout': '30'}, 'haproxy': {'neutron_server': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9696', 'listen_port': '9696'}, 'neutron_server_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9696', 'listen_port': '9696'}}}}) 2025-09-27 00:49:34.397589 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'neutron-metering-agent', 'value': {'container_name': 'neutron_metering_agent', 'image': 'registry.osism.tech/kolla/neutron-metering-agent:2024.2', 'privileged': True, 'enabled': False, 'group': 'neutron-metering-agent', 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-metering-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}}})  2025-09-27 00:49:34.397597 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'ironic-neutron-agent', 'value': {'container_name': 'ironic_neutron_agent', 'image': 'registry.osism.tech/kolla/ironic-neutron-agent:2024.2', 'privileged': False, 'enabled': False, 'group': 'ironic-neutron-agent', 'host_in_groups': True, 'volumes': ['/etc/kolla/ironic-neutron-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port ironic-neutron-agent 5672'], 'timeout': '30'}}})  2025-09-27 00:49:34.397615 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'ironic-neutron-agent', 'value': {'container_name': 'ironic_neutron_agent', 'image': 'registry.osism.tech/kolla/ironic-neutron-agent:2024.2', 'privileged': False, 'enabled': False, 'group': 'ironic-neutron-agent', 'host_in_groups': True, 'volumes': 
['/etc/kolla/ironic-neutron-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port ironic-neutron-agent 5672'], 'timeout': '30'}}})  2025-09-27 00:49:34.397626 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'neutron-tls-proxy', 'value': {'container_name': 'neutron_tls_proxy', 'group': 'neutron-server', 'host_in_groups': True, 'enabled': 'no', 'image': 'registry.osism.tech/kolla/neutron-tls-proxy:2024.2', 'volumes': ['/etc/kolla/neutron-tls-proxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl -u openstack:password 192.168.16.11:9697'], 'timeout': '30'}, 'haproxy': {'neutron_tls_proxy': {'enabled': False, 'mode': 'http', 'external': False, 'port': '9696', 'listen_port': '9696', 'tls_backend': 'yes'}, 'neutron_tls_proxy_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9696', 'listen_port': '9696', 'tls_backend': 'yes'}}}})  2025-09-27 00:49:34.397634 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'neutron-openvswitch-agent', 'value': {'container_name': 'neutron_openvswitch_agent', 'image': 'registry.osism.tech/kolla/neutron-openvswitch-agent:2024.2', 'enabled': False, 'privileged': True, 'environment': {'KOLLA_LEGACY_IPTABLES': 'false'}, 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-openvswitch-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-openvswitch-agent 5672'], 'timeout': '30'}}})  2025-09-27 00:49:34.397646 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'neutron-tls-proxy', 'value': {'container_name': 'neutron_tls_proxy', 'group': 'neutron-server', 'host_in_groups': True, 'enabled': 'no', 'image': 'registry.osism.tech/kolla/neutron-tls-proxy:2024.2', 'volumes': ['/etc/kolla/neutron-tls-proxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl -u openstack:password 192.168.16.10:9697'], 'timeout': '30'}, 'haproxy': {'neutron_tls_proxy': {'enabled': False, 'mode': 'http', 'external': False, 'port': '9696', 'listen_port': '9696', 'tls_backend': 'yes'}, 'neutron_tls_proxy_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9696', 'listen_port': '9696', 'tls_backend': 'yes'}}}})  2025-09-27 00:49:34.397653 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'neutron-linuxbridge-agent', 'value': {'container_name': 'neutron_linuxbridge_agent', 'image': 'registry.osism.tech/kolla/neutron-linuxbridge-agent:2024.2', 'privileged': True, 'enabled': False, 'environment': {'KOLLA_LEGACY_IPTABLES': 'false'}, 'host_in_groups': True, 'volumes': 
['/etc/kolla/neutron-linuxbridge-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-linuxbridge-agent 5672'], 'timeout': '30'}}})  2025-09-27 00:49:34.397671 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'neutron-ovn-agent', 'value': {'container_name': 'neutron_ovn_agent', 'group': 'neutron-ovn-agent', 'host_in_groups': False, 'enabled': False, 'image': 'registry.osism.tech/dockerhub/kolla/neutron-ovn-agent:2024.2', 'volumes': ['/etc/kolla/neutron-ovn-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-ovn-agent 6640'], 'timeout': '30'}}})  2025-09-27 00:49:34.397682 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'neutron-ovn-agent', 'value': {'container_name': 'neutron_ovn_agent', 'group': 'neutron-ovn-agent', 'host_in_groups': False, 'enabled': False, 'image': 'registry.osism.tech/dockerhub/kolla/neutron-ovn-agent:2024.2', 'volumes': ['/etc/kolla/neutron-ovn-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-ovn-agent 6640'], 'timeout': '30'}}})  2025-09-27 00:49:34.397689 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'neutron-dhcp-agent', 'value': {'container_name': 'neutron_dhcp_agent', 'image': 'registry.osism.tech/kolla/neutron-dhcp-agent:2024.2', 'privileged': True, 'enabled': False, 'group': 'neutron-dhcp-agent', 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-dhcp-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'neutron_metadata_socket:/var/lib/neutron/kolla/', '/run/netns:/run/netns:shared', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-dhcp-agent 5672'], 'timeout': '30'}}})  2025-09-27 00:49:34.397696 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'neutron-l3-agent', 'value': {'container_name': 'neutron_l3_agent', 'image': 'registry.osism.tech/kolla/neutron-l3-agent:2024.2', 'privileged': True, 'enabled': False, 'environment': {'KOLLA_LEGACY_IPTABLES': 'false'}, 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-l3-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'neutron_metadata_socket:/var/lib/neutron/kolla/', '/run/netns:/run/netns:shared', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', "healthcheck_port 'neutron-l3-agent ' 5672"], 'timeout': '30'}}})  2025-09-27 00:49:34.397708 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'neutron-sriov-agent', 'value': {'container_name': 'neutron_sriov_agent', 'image': 'registry.osism.tech/kolla/neutron-sriov-agent:2024.2', 'privileged': True, 'enabled': 
False, 'host_in_groups': False, 'volumes': ['/etc/kolla/neutron-sriov-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-sriov-nic-agent 5672'], 'timeout': '30'}}})  2025-09-27 00:49:34.397715 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'neutron-mlnx-agent', 'value': {'container_name': 'neutron_mlnx_agent', 'image': 'registry.osism.tech/kolla/neutron-mlnx-agent:2024.2', 'enabled': False, 'host_in_groups': False, 'volumes': ['/etc/kolla/neutron-mlnx-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}}})  2025-09-27 00:49:34.397722 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'neutron-eswitchd', 'value': {'container_name': 'neutron_eswitchd', 'image': 'registry.osism.tech/kolla/neutron-eswitchd:2024.2', 'privileged': True, 'enabled': False, 'host_in_groups': False, 'volumes': ['/etc/kolla/neutron-eswitchd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/run/libvirt:/run/libvirt:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}}})  2025-09-27 00:49:34.397740 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'neutron-metadata-agent', 'value': {'container_name': 'neutron_metadata_agent', 'image': 'registry.osism.tech/kolla/neutron-metadata-agent:2024.2', 'privileged': True, 'enabled': False, 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-metadata-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'neutron_metadata_socket:/var/lib/neutron/kolla/', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-metadata-agent 5672'], 'timeout': '30'}}})  2025-09-27 00:49:34.397751 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'neutron-ovn-metadata-agent', 'value': {'container_name': 'neutron_ovn_metadata_agent', 'image': 'registry.osism.tech/kolla/neutron-metadata-agent:2024.2', 'privileged': True, 'enabled': True, 'host_in_groups': False, 'volumes': ['/etc/kolla/neutron-ovn-metadata-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'neutron_metadata_socket:/var/lib/neutron/kolla/', '/run/openvswitch:/run/openvswitch:shared', '/run/netns:/run/netns:shared', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-ovn-metadata-agent 6640'], 'timeout': '30'}}})  2025-09-27 00:49:34.397758 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'neutron-bgp-dragent', 'value': {'container_name': 'neutron_bgp_dragent', 'image': 'registry.osism.tech/kolla/neutron-bgp-dragent:2024.2', 'privileged': True, 'enabled': False, 'group': 'neutron-bgp-dragent', 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-bgp-dragent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-bgp-dragent 
5672'], 'timeout': '30'}}})  2025-09-27 00:49:34.397770 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'neutron-infoblox-ipam-agent', 'value': {'container_name': 'neutron_infoblox_ipam_agent', 'image': 'registry.osism.tech/kolla/neutron-infoblox-ipam-agent:2024.2', 'privileged': True, 'enabled': False, 'group': 'neutron-infoblox-ipam-agent', 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-infoblox-ipam-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}}})  2025-09-27 00:49:34.397777 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'neutron-metering-agent', 'value': {'container_name': 'neutron_metering_agent', 'image': 'registry.osism.tech/kolla/neutron-metering-agent:2024.2', 'privileged': True, 'enabled': False, 'group': 'neutron-metering-agent', 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-metering-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}}})  2025-09-27 00:49:34.397784 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'ironic-neutron-agent', 'value': {'container_name': 'ironic_neutron_agent', 'image': 'registry.osism.tech/kolla/ironic-neutron-agent:2024.2', 'privileged': False, 'enabled': False, 'group': 'ironic-neutron-agent', 'host_in_groups': True, 'volumes': ['/etc/kolla/ironic-neutron-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port ironic-neutron-agent 5672'], 'timeout': '30'}}})  2025-09-27 00:49:34.397802 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'neutron-tls-proxy', 'value': {'container_name': 'neutron_tls_proxy', 'group': 'neutron-server', 'host_in_groups': True, 'enabled': 'no', 'image': 'registry.osism.tech/kolla/neutron-tls-proxy:2024.2', 'volumes': ['/etc/kolla/neutron-tls-proxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl -u openstack:password 192.168.16.12:9697'], 'timeout': '30'}, 'haproxy': {'neutron_tls_proxy': {'enabled': False, 'mode': 'http', 'external': False, 'port': '9696', 'listen_port': '9696', 'tls_backend': 'yes'}, 'neutron_tls_proxy_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9696', 'listen_port': '9696', 'tls_backend': 'yes'}}}})  2025-09-27 00:49:34.397813 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'neutron-ovn-agent', 'value': {'container_name': 'neutron_ovn_agent', 'group': 'neutron-ovn-agent', 'host_in_groups': False, 'enabled': False, 'image': 'registry.osism.tech/dockerhub/kolla/neutron-ovn-agent:2024.2', 'volumes': ['/etc/kolla/neutron-ovn-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-ovn-agent 6640'], 'timeout': '30'}}})  2025-09-27 00:49:34.397820 | orchestrator | 2025-09-27 
00:49:34.397827 | orchestrator | TASK [haproxy-config : Add configuration for neutron when using single external frontend] *** 2025-09-27 00:49:34.397834 | orchestrator | Saturday 27 September 2025 00:47:38 +0000 (0:00:04.066) 0:03:31.474 **** 2025-09-27 00:49:34.397846 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'neutron-server', 'value': {'container_name': 'neutron_server', 'image': 'registry.osism.tech/kolla/neutron-server:2024.2', 'enabled': True, 'group': 'neutron-server', 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:9696'], 'timeout': '30'}, 'haproxy': {'neutron_server': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9696', 'listen_port': '9696'}, 'neutron_server_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9696', 'listen_port': '9696'}}}})  2025-09-27 00:49:34.397854 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'neutron-openvswitch-agent', 'value': {'container_name': 'neutron_openvswitch_agent', 'image': 'registry.osism.tech/kolla/neutron-openvswitch-agent:2024.2', 'enabled': False, 'privileged': True, 'environment': {'KOLLA_LEGACY_IPTABLES': 'false'}, 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-openvswitch-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-openvswitch-agent 5672'], 'timeout': '30'}}})  2025-09-27 00:49:34.397861 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'neutron-linuxbridge-agent', 'value': {'container_name': 'neutron_linuxbridge_agent', 'image': 'registry.osism.tech/kolla/neutron-linuxbridge-agent:2024.2', 'privileged': True, 'enabled': False, 'environment': {'KOLLA_LEGACY_IPTABLES': 'false'}, 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-linuxbridge-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-linuxbridge-agent 5672'], 'timeout': '30'}}})  2025-09-27 00:49:34.397878 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'neutron-dhcp-agent', 'value': {'container_name': 'neutron_dhcp_agent', 'image': 'registry.osism.tech/kolla/neutron-dhcp-agent:2024.2', 'privileged': True, 'enabled': False, 'group': 'neutron-dhcp-agent', 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-dhcp-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'neutron_metadata_socket:/var/lib/neutron/kolla/', '/run/netns:/run/netns:shared', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-dhcp-agent 5672'], 'timeout': '30'}}})  2025-09-27 00:49:34.397889 | 
orchestrator | skipping: [testbed-node-0] => (item={'key': 'neutron-l3-agent', 'value': {'container_name': 'neutron_l3_agent', 'image': 'registry.osism.tech/kolla/neutron-l3-agent:2024.2', 'privileged': True, 'enabled': False, 'environment': {'KOLLA_LEGACY_IPTABLES': 'false'}, 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-l3-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'neutron_metadata_socket:/var/lib/neutron/kolla/', '/run/netns:/run/netns:shared', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', "healthcheck_port 'neutron-l3-agent ' 5672"], 'timeout': '30'}}})  2025-09-27 00:49:34.397901 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'neutron-sriov-agent', 'value': {'container_name': 'neutron_sriov_agent', 'image': 'registry.osism.tech/kolla/neutron-sriov-agent:2024.2', 'privileged': True, 'enabled': False, 'host_in_groups': False, 'volumes': ['/etc/kolla/neutron-sriov-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-sriov-nic-agent 5672'], 'timeout': '30'}}})  2025-09-27 00:49:34.397908 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'neutron-mlnx-agent', 'value': {'container_name': 'neutron_mlnx_agent', 'image': 'registry.osism.tech/kolla/neutron-mlnx-agent:2024.2', 'enabled': False, 'host_in_groups': False, 'volumes': ['/etc/kolla/neutron-mlnx-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}}})  2025-09-27 00:49:34.397915 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'neutron-eswitchd', 'value': {'container_name': 'neutron_eswitchd', 'image': 'registry.osism.tech/kolla/neutron-eswitchd:2024.2', 'privileged': True, 'enabled': False, 'host_in_groups': False, 'volumes': ['/etc/kolla/neutron-eswitchd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/run/libvirt:/run/libvirt:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}}})  2025-09-27 00:49:34.397922 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'neutron-metadata-agent', 'value': {'container_name': 'neutron_metadata_agent', 'image': 'registry.osism.tech/kolla/neutron-metadata-agent:2024.2', 'privileged': True, 'enabled': False, 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-metadata-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'neutron_metadata_socket:/var/lib/neutron/kolla/', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-metadata-agent 5672'], 'timeout': '30'}}})  2025-09-27 00:49:34.397939 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'neutron-server', 'value': {'container_name': 'neutron_server', 'image': 'registry.osism.tech/kolla/neutron-server:2024.2', 'enabled': True, 'group': 'neutron-server', 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:9696'], 'timeout': '30'}, 'haproxy': {'neutron_server': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9696', 'listen_port': '9696'}, 'neutron_server_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9696', 'listen_port': '9696'}}}})  2025-09-27 00:49:34.397952 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'neutron-ovn-metadata-agent', 'value': {'container_name': 'neutron_ovn_metadata_agent', 'image': 'registry.osism.tech/kolla/neutron-metadata-agent:2024.2', 'privileged': True, 'enabled': True, 'host_in_groups': False, 'volumes': ['/etc/kolla/neutron-ovn-metadata-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'neutron_metadata_socket:/var/lib/neutron/kolla/', '/run/openvswitch:/run/openvswitch:shared', '/run/netns:/run/netns:shared', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-ovn-metadata-agent 6640'], 'timeout': '30'}}})  2025-09-27 00:49:34.397960 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'neutron-openvswitch-agent', 'value': {'container_name': 'neutron_openvswitch_agent', 'image': 'registry.osism.tech/kolla/neutron-openvswitch-agent:2024.2', 'enabled': False, 'privileged': True, 'environment': {'KOLLA_LEGACY_IPTABLES': 'false'}, 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-openvswitch-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-openvswitch-agent 5672'], 'timeout': '30'}}})  2025-09-27 00:49:34.397973 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'neutron-bgp-dragent', 'value': {'container_name': 'neutron_bgp_dragent', 'image': 'registry.osism.tech/kolla/neutron-bgp-dragent:2024.2', 'privileged': True, 'enabled': False, 'group': 'neutron-bgp-dragent', 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-bgp-dragent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-bgp-dragent 5672'], 'timeout': '30'}}})  2025-09-27 00:49:34.397980 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'neutron-linuxbridge-agent', 'value': {'container_name': 'neutron_linuxbridge_agent', 'image': 'registry.osism.tech/kolla/neutron-linuxbridge-agent:2024.2', 'privileged': True, 'enabled': False, 'environment': {'KOLLA_LEGACY_IPTABLES': 'false'}, 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-linuxbridge-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-linuxbridge-agent 
5672'], 'timeout': '30'}}})  2025-09-27 00:49:34.397987 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'neutron-infoblox-ipam-agent', 'value': {'container_name': 'neutron_infoblox_ipam_agent', 'image': 'registry.osism.tech/kolla/neutron-infoblox-ipam-agent:2024.2', 'privileged': True, 'enabled': False, 'group': 'neutron-infoblox-ipam-agent', 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-infoblox-ipam-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}}})  2025-09-27 00:49:34.397998 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'neutron-dhcp-agent', 'value': {'container_name': 'neutron_dhcp_agent', 'image': 'registry.osism.tech/kolla/neutron-dhcp-agent:2024.2', 'privileged': True, 'enabled': False, 'group': 'neutron-dhcp-agent', 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-dhcp-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'neutron_metadata_socket:/var/lib/neutron/kolla/', '/run/netns:/run/netns:shared', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-dhcp-agent 5672'], 'timeout': '30'}}})  2025-09-27 00:49:34.398008 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'neutron-metering-agent', 'value': {'container_name': 'neutron_metering_agent', 'image': 'registry.osism.tech/kolla/neutron-metering-agent:2024.2', 'privileged': True, 'enabled': False, 'group': 'neutron-metering-agent', 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-metering-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}}})  2025-09-27 00:49:34.398034 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'neutron-l3-agent', 'value': {'container_name': 'neutron_l3_agent', 'image': 'registry.osism.tech/kolla/neutron-l3-agent:2024.2', 'privileged': True, 'enabled': False, 'environment': {'KOLLA_LEGACY_IPTABLES': 'false'}, 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-l3-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'neutron_metadata_socket:/var/lib/neutron/kolla/', '/run/netns:/run/netns:shared', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', "healthcheck_port 'neutron-l3-agent ' 5672"], 'timeout': '30'}}})  2025-09-27 00:49:34.398048 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'ironic-neutron-agent', 'value': {'container_name': 'ironic_neutron_agent', 'image': 'registry.osism.tech/kolla/ironic-neutron-agent:2024.2', 'privileged': False, 'enabled': False, 'group': 'ironic-neutron-agent', 'host_in_groups': True, 'volumes': ['/etc/kolla/ironic-neutron-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port ironic-neutron-agent 5672'], 'timeout': '30'}}})  2025-09-27 00:49:34.398055 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'neutron-sriov-agent', 'value': {'container_name': 
'neutron_sriov_agent', 'image': 'registry.osism.tech/kolla/neutron-sriov-agent:2024.2', 'privileged': True, 'enabled': False, 'host_in_groups': False, 'volumes': ['/etc/kolla/neutron-sriov-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-sriov-nic-agent 5672'], 'timeout': '30'}}})  2025-09-27 00:49:34.398062 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'neutron-tls-proxy', 'value': {'container_name': 'neutron_tls_proxy', 'group': 'neutron-server', 'host_in_groups': True, 'enabled': 'no', 'image': 'registry.osism.tech/kolla/neutron-tls-proxy:2024.2', 'volumes': ['/etc/kolla/neutron-tls-proxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl -u openstack:password 192.168.16.10:9697'], 'timeout': '30'}, 'haproxy': {'neutron_tls_proxy': {'enabled': False, 'mode': 'http', 'external': False, 'port': '9696', 'listen_port': '9696', 'tls_backend': 'yes'}, 'neutron_tls_proxy_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9696', 'listen_port': '9696', 'tls_backend': 'yes'}}}})  2025-09-27 00:49:34.398074 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'neutron-mlnx-agent', 'value': {'container_name': 'neutron_mlnx_agent', 'image': 'registry.osism.tech/kolla/neutron-mlnx-agent:2024.2', 'enabled': False, 'host_in_groups': False, 'volumes': ['/etc/kolla/neutron-mlnx-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}}})  2025-09-27 00:49:34.398084 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'neutron-server', 'value': {'container_name': 'neutron_server', 'image': 'registry.osism.tech/kolla/neutron-server:2024.2', 'enabled': True, 'group': 'neutron-server', 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:9696'], 'timeout': '30'}, 'haproxy': {'neutron_server': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9696', 'listen_port': '9696'}, 'neutron_server_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9696', 'listen_port': '9696'}}}})  2025-09-27 00:49:34.398096 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'neutron-ovn-agent', 'value': {'container_name': 'neutron_ovn_agent', 'group': 'neutron-ovn-agent', 'host_in_groups': False, 'enabled': False, 'image': 'registry.osism.tech/dockerhub/kolla/neutron-ovn-agent:2024.2', 'volumes': ['/etc/kolla/neutron-ovn-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port 
neutron-ovn-agent 6640'], 'timeout': '30'}}})  2025-09-27 00:49:34.398103 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'neutron-eswitchd', 'value': {'container_name': 'neutron_eswitchd', 'image': 'registry.osism.tech/kolla/neutron-eswitchd:2024.2', 'privileged': True, 'enabled': False, 'host_in_groups': False, 'volumes': ['/etc/kolla/neutron-eswitchd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/run/libvirt:/run/libvirt:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}}})  2025-09-27 00:49:34.398110 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:49:34.398117 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'neutron-openvswitch-agent', 'value': {'container_name': 'neutron_openvswitch_agent', 'image': 'registry.osism.tech/kolla/neutron-openvswitch-agent:2024.2', 'enabled': False, 'privileged': True, 'environment': {'KOLLA_LEGACY_IPTABLES': 'false'}, 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-openvswitch-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-openvswitch-agent 5672'], 'timeout': '30'}}})  2025-09-27 00:49:34.398124 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'neutron-metadata-agent', 'value': {'container_name': 'neutron_metadata_agent', 'image': 'registry.osism.tech/kolla/neutron-metadata-agent:2024.2', 'privileged': True, 'enabled': False, 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-metadata-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'neutron_metadata_socket:/var/lib/neutron/kolla/', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-metadata-agent 5672'], 'timeout': '30'}}})  2025-09-27 00:49:34.398142 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'neutron-linuxbridge-agent', 'value': {'container_name': 'neutron_linuxbridge_agent', 'image': 'registry.osism.tech/kolla/neutron-linuxbridge-agent:2024.2', 'privileged': True, 'enabled': False, 'environment': {'KOLLA_LEGACY_IPTABLES': 'false'}, 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-linuxbridge-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-linuxbridge-agent 5672'], 'timeout': '30'}}})  2025-09-27 00:49:34.398153 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'neutron-ovn-metadata-agent', 'value': {'container_name': 'neutron_ovn_metadata_agent', 'image': 'registry.osism.tech/kolla/neutron-metadata-agent:2024.2', 'privileged': True, 'enabled': True, 'host_in_groups': False, 'volumes': ['/etc/kolla/neutron-ovn-metadata-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'neutron_metadata_socket:/var/lib/neutron/kolla/', '/run/openvswitch:/run/openvswitch:shared', '/run/netns:/run/netns:shared', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 
'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-ovn-metadata-agent 6640'], 'timeout': '30'}}})  2025-09-27 00:49:34.398165 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'neutron-dhcp-agent', 'value': {'container_name': 'neutron_dhcp_agent', 'image': 'registry.osism.tech/kolla/neutron-dhcp-agent:2024.2', 'privileged': True, 'enabled': False, 'group': 'neutron-dhcp-agent', 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-dhcp-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'neutron_metadata_socket:/var/lib/neutron/kolla/', '/run/netns:/run/netns:shared', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-dhcp-agent 5672'], 'timeout': '30'}}})  2025-09-27 00:49:34.398172 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'neutron-bgp-dragent', 'value': {'container_name': 'neutron_bgp_dragent', 'image': 'registry.osism.tech/kolla/neutron-bgp-dragent:2024.2', 'privileged': True, 'enabled': False, 'group': 'neutron-bgp-dragent', 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-bgp-dragent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-bgp-dragent 5672'], 'timeout': '30'}}})  2025-09-27 00:49:34.398179 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'neutron-l3-agent', 'value': {'container_name': 'neutron_l3_agent', 'image': 'registry.osism.tech/kolla/neutron-l3-agent:2024.2', 'privileged': True, 'enabled': False, 'environment': {'KOLLA_LEGACY_IPTABLES': 'false'}, 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-l3-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'neutron_metadata_socket:/var/lib/neutron/kolla/', '/run/netns:/run/netns:shared', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', "healthcheck_port 'neutron-l3-agent ' 5672"], 'timeout': '30'}}})  2025-09-27 00:49:34.398186 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'neutron-infoblox-ipam-agent', 'value': {'container_name': 'neutron_infoblox_ipam_agent', 'image': 'registry.osism.tech/kolla/neutron-infoblox-ipam-agent:2024.2', 'privileged': True, 'enabled': False, 'group': 'neutron-infoblox-ipam-agent', 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-infoblox-ipam-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}}})  2025-09-27 00:49:34.398232 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'neutron-sriov-agent', 'value': {'container_name': 'neutron_sriov_agent', 'image': 'registry.osism.tech/kolla/neutron-sriov-agent:2024.2', 'privileged': True, 'enabled': False, 'host_in_groups': False, 'volumes': ['/etc/kolla/neutron-sriov-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': 
'5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-sriov-nic-agent 5672'], 'timeout': '30'}}})  2025-09-27 00:49:34.398249 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'neutron-metering-agent', 'value': {'container_name': 'neutron_metering_agent', 'image': 'registry.osism.tech/kolla/neutron-metering-agent:2024.2', 'privileged': True, 'enabled': False, 'group': 'neutron-metering-agent', 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-metering-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}}})  2025-09-27 00:49:34.398257 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'neutron-mlnx-agent', 'value': {'container_name': 'neutron_mlnx_agent', 'image': 'registry.osism.tech/kolla/neutron-mlnx-agent:2024.2', 'enabled': False, 'host_in_groups': False, 'volumes': ['/etc/kolla/neutron-mlnx-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}}})  2025-09-27 00:49:34.398264 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'ironic-neutron-agent', 'value': {'container_name': 'ironic_neutron_agent', 'image': 'registry.osism.tech/kolla/ironic-neutron-agent:2024.2', 'privileged': False, 'enabled': False, 'group': 'ironic-neutron-agent', 'host_in_groups': True, 'volumes': ['/etc/kolla/ironic-neutron-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port ironic-neutron-agent 5672'], 'timeout': '30'}}})  2025-09-27 00:49:34.398271 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'neutron-eswitchd', 'value': {'container_name': 'neutron_eswitchd', 'image': 'registry.osism.tech/kolla/neutron-eswitchd:2024.2', 'privileged': True, 'enabled': False, 'host_in_groups': False, 'volumes': ['/etc/kolla/neutron-eswitchd/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/run/libvirt:/run/libvirt:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}}})  2025-09-27 00:49:34.398278 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'neutron-tls-proxy', 'value': {'container_name': 'neutron_tls_proxy', 'group': 'neutron-server', 'host_in_groups': True, 'enabled': 'no', 'image': 'registry.osism.tech/kolla/neutron-tls-proxy:2024.2', 'volumes': ['/etc/kolla/neutron-tls-proxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl -u openstack:password 192.168.16.11:9697'], 'timeout': '30'}, 'haproxy': {'neutron_tls_proxy': {'enabled': False, 'mode': 'http', 'external': False, 'port': '9696', 'listen_port': '9696', 'tls_backend': 'yes'}, 'neutron_tls_proxy_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9696', 'listen_port': '9696', 'tls_backend': 'yes'}}}})  2025-09-27 00:49:34.398285 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'neutron-metadata-agent', 'value': {'container_name': 'neutron_metadata_agent', 'image': 
'registry.osism.tech/kolla/neutron-metadata-agent:2024.2', 'privileged': True, 'enabled': False, 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-metadata-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'neutron_metadata_socket:/var/lib/neutron/kolla/', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-metadata-agent 5672'], 'timeout': '30'}}})  2025-09-27 00:49:34.398304 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'neutron-ovn-agent', 'value': {'container_name': 'neutron_ovn_agent', 'group': 'neutron-ovn-agent', 'host_in_groups': False, 'enabled': False, 'image': 'registry.osism.tech/dockerhub/kolla/neutron-ovn-agent:2024.2', 'volumes': ['/etc/kolla/neutron-ovn-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-ovn-agent 6640'], 'timeout': '30'}}})  2025-09-27 00:49:34.398319 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'neutron-ovn-metadata-agent', 'value': {'container_name': 'neutron_ovn_metadata_agent', 'image': 'registry.osism.tech/kolla/neutron-metadata-agent:2024.2', 'privileged': True, 'enabled': True, 'host_in_groups': False, 'volumes': ['/etc/kolla/neutron-ovn-metadata-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 'neutron_metadata_socket:/var/lib/neutron/kolla/', '/run/openvswitch:/run/openvswitch:shared', '/run/netns:/run/netns:shared', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-ovn-metadata-agent 6640'], 'timeout': '30'}}})  2025-09-27 00:49:34.398326 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:49:34.398333 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'neutron-bgp-dragent', 'value': {'container_name': 'neutron_bgp_dragent', 'image': 'registry.osism.tech/kolla/neutron-bgp-dragent:2024.2', 'privileged': True, 'enabled': False, 'group': 'neutron-bgp-dragent', 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-bgp-dragent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-bgp-dragent 5672'], 'timeout': '30'}}})  2025-09-27 00:49:34.398340 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'neutron-infoblox-ipam-agent', 'value': {'container_name': 'neutron_infoblox_ipam_agent', 'image': 'registry.osism.tech/kolla/neutron-infoblox-ipam-agent:2024.2', 'privileged': True, 'enabled': False, 'group': 'neutron-infoblox-ipam-agent', 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-infoblox-ipam-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}}})  2025-09-27 00:49:34.398347 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'neutron-metering-agent', 'value': {'container_name': 'neutron_metering_agent', 'image': 'registry.osism.tech/kolla/neutron-metering-agent:2024.2', 'privileged': 
True, 'enabled': False, 'group': 'neutron-metering-agent', 'host_in_groups': True, 'volumes': ['/etc/kolla/neutron-metering-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}}})  2025-09-27 00:49:34.398354 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'ironic-neutron-agent', 'value': {'container_name': 'ironic_neutron_agent', 'image': 'registry.osism.tech/kolla/ironic-neutron-agent:2024.2', 'privileged': False, 'enabled': False, 'group': 'ironic-neutron-agent', 'host_in_groups': True, 'volumes': ['/etc/kolla/ironic-neutron-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port ironic-neutron-agent 5672'], 'timeout': '30'}}})  2025-09-27 00:49:34.398372 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'neutron-tls-proxy', 'value': {'container_name': 'neutron_tls_proxy', 'group': 'neutron-server', 'host_in_groups': True, 'enabled': 'no', 'image': 'registry.osism.tech/kolla/neutron-tls-proxy:2024.2', 'volumes': ['/etc/kolla/neutron-tls-proxy/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl -u openstack:password 192.168.16.12:9697'], 'timeout': '30'}, 'haproxy': {'neutron_tls_proxy': {'enabled': False, 'mode': 'http', 'external': False, 'port': '9696', 'listen_port': '9696', 'tls_backend': 'yes'}, 'neutron_tls_proxy_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9696', 'listen_port': '9696', 'tls_backend': 'yes'}}}})  2025-09-27 00:49:34.398388 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'neutron-ovn-agent', 'value': {'container_name': 'neutron_ovn_agent', 'group': 'neutron-ovn-agent', 'host_in_groups': False, 'enabled': False, 'image': 'registry.osism.tech/dockerhub/kolla/neutron-ovn-agent:2024.2', 'volumes': ['/etc/kolla/neutron-ovn-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port neutron-ovn-agent 6640'], 'timeout': '30'}}})  2025-09-27 00:49:34.398395 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:49:34.398401 | orchestrator | 2025-09-27 00:49:34.398408 | orchestrator | TASK [haproxy-config : Configuring firewall for neutron] *********************** 2025-09-27 00:49:34.398415 | orchestrator | Saturday 27 September 2025 00:47:39 +0000 (0:00:01.431) 0:03:32.906 **** 2025-09-27 00:49:34.398422 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'neutron_server', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9696', 'listen_port': '9696'}})  2025-09-27 00:49:34.398429 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'neutron_server_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9696', 'listen_port': '9696'}})  2025-09-27 00:49:34.398435 | orchestrator | skipping: [testbed-node-0] 
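For reference, the neutron_server / neutron_server_external entries iterated by the firewall task above (and by the preceding haproxy-config tasks) come from the per-service haproxy sub-dictionary of the service definition that the role loops over. The following is a minimal YAML sketch of that structure, reconstructed from the item data printed in the log (image, ports, FQDN and healthcheck command are copied verbatim from the items above; this is illustrative and not the literal kolla-ansible source file):

# Sketch of the neutron-server service entry as reflected in the logged items
neutron-server:
  container_name: neutron_server
  image: registry.osism.tech/kolla/neutron-server:2024.2
  enabled: true
  group: neutron-server
  healthcheck:
    interval: "30"
    retries: "3"
    start_period: "5"
    test: ["CMD-SHELL", "healthcheck_curl http://192.168.16.10:9696"]
    timeout: "30"
  haproxy:
    neutron_server:               # internal VIP frontend
      enabled: true
      mode: http
      external: false
      port: "9696"
      listen_port: "9696"
    neutron_server_external:      # external frontend behind api.testbed.osism.xyz
      enabled: true
      mode: http
      external: true
      external_fqdn: api.testbed.osism.xyz
      port: "9696"
      listen_port: "9696"

This matches the pattern visible above: only neutron-server, which is enabled and carries a haproxy sub-dict, is reported as changed by the haproxy config copy task, while the disabled agents (and entries without a haproxy section) are skipped on all three nodes.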
2025-09-27 00:49:34.398442 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'neutron_server', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9696', 'listen_port': '9696'}})  2025-09-27 00:49:34.398449 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'neutron_server_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9696', 'listen_port': '9696'}})  2025-09-27 00:49:34.398455 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:49:34.398462 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'neutron_server', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9696', 'listen_port': '9696'}})  2025-09-27 00:49:34.398468 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'neutron_server_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9696', 'listen_port': '9696'}})  2025-09-27 00:49:34.398475 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:49:34.398482 | orchestrator | 2025-09-27 00:49:34.398488 | orchestrator | TASK [proxysql-config : Copying over neutron ProxySQL users config] ************ 2025-09-27 00:49:34.398495 | orchestrator | Saturday 27 September 2025 00:47:41 +0000 (0:00:01.948) 0:03:34.855 **** 2025-09-27 00:49:34.398501 | orchestrator | changed: [testbed-node-0] 2025-09-27 00:49:34.398508 | orchestrator | changed: [testbed-node-1] 2025-09-27 00:49:34.398515 | orchestrator | changed: [testbed-node-2] 2025-09-27 00:49:34.398521 | orchestrator | 2025-09-27 00:49:34.398528 | orchestrator | TASK [proxysql-config : Copying over neutron ProxySQL rules config] ************ 2025-09-27 00:49:34.398535 | orchestrator | Saturday 27 September 2025 00:47:42 +0000 (0:00:01.273) 0:03:36.129 **** 2025-09-27 00:49:34.398541 | orchestrator | changed: [testbed-node-0] 2025-09-27 00:49:34.398548 | orchestrator | changed: [testbed-node-1] 2025-09-27 00:49:34.398559 | orchestrator | changed: [testbed-node-2] 2025-09-27 00:49:34.398565 | orchestrator | 2025-09-27 00:49:34.398572 | orchestrator | TASK [include_role : placement] ************************************************ 2025-09-27 00:49:34.398579 | orchestrator | Saturday 27 September 2025 00:47:44 +0000 (0:00:01.874) 0:03:38.003 **** 2025-09-27 00:49:34.398585 | orchestrator | included: placement for testbed-node-0, testbed-node-1, testbed-node-2 2025-09-27 00:49:34.398592 | orchestrator | 2025-09-27 00:49:34.398598 | orchestrator | TASK [haproxy-config : Copying over placement haproxy config] ****************** 2025-09-27 00:49:34.398605 | orchestrator | Saturday 27 September 2025 00:47:45 +0000 (0:00:01.095) 0:03:39.099 **** 2025-09-27 00:49:34.398623 | orchestrator | changed: [testbed-node-0] => (item={'key': 'placement-api', 'value': {'container_name': 'placement_api', 'group': 'placement-api', 'image': 'registry.osism.tech/kolla/placement-api:2024.2', 'enabled': True, 'volumes': ['/etc/kolla/placement-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:8780'], 'timeout': '30'}, 'haproxy': {'placement_api': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8780', 'listen_port': '8780', 'tls_backend': 'no'}, 'placement_api_external': {'enabled': True, 
'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8780', 'listen_port': '8780', 'tls_backend': 'no'}}}}) 2025-09-27 00:49:34.398637 | orchestrator | changed: [testbed-node-1] => (item={'key': 'placement-api', 'value': {'container_name': 'placement_api', 'group': 'placement-api', 'image': 'registry.osism.tech/kolla/placement-api:2024.2', 'enabled': True, 'volumes': ['/etc/kolla/placement-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:8780'], 'timeout': '30'}, 'haproxy': {'placement_api': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8780', 'listen_port': '8780', 'tls_backend': 'no'}, 'placement_api_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8780', 'listen_port': '8780', 'tls_backend': 'no'}}}}) 2025-09-27 00:49:34.398644 | orchestrator | changed: [testbed-node-2] => (item={'key': 'placement-api', 'value': {'container_name': 'placement_api', 'group': 'placement-api', 'image': 'registry.osism.tech/kolla/placement-api:2024.2', 'enabled': True, 'volumes': ['/etc/kolla/placement-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:8780'], 'timeout': '30'}, 'haproxy': {'placement_api': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8780', 'listen_port': '8780', 'tls_backend': 'no'}, 'placement_api_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8780', 'listen_port': '8780', 'tls_backend': 'no'}}}}) 2025-09-27 00:49:34.398651 | orchestrator | 2025-09-27 00:49:34.398658 | orchestrator | TASK [haproxy-config : Add configuration for placement when using single external frontend] *** 2025-09-27 00:49:34.398664 | orchestrator | Saturday 27 September 2025 00:47:48 +0000 (0:00:03.067) 0:03:42.167 **** 2025-09-27 00:49:34.398671 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'placement-api', 'value': {'container_name': 'placement_api', 'group': 'placement-api', 'image': 'registry.osism.tech/kolla/placement-api:2024.2', 'enabled': True, 'volumes': ['/etc/kolla/placement-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:8780'], 'timeout': '30'}, 'haproxy': {'placement_api': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8780', 'listen_port': '8780', 'tls_backend': 'no'}, 'placement_api_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8780', 'listen_port': '8780', 'tls_backend': 'no'}}}})  2025-09-27 00:49:34.398683 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:49:34.398701 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'placement-api', 'value': {'container_name': 'placement_api', 'group': 'placement-api', 'image': 'registry.osism.tech/kolla/placement-api:2024.2', 
'enabled': True, 'volumes': ['/etc/kolla/placement-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:8780'], 'timeout': '30'}, 'haproxy': {'placement_api': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8780', 'listen_port': '8780', 'tls_backend': 'no'}, 'placement_api_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8780', 'listen_port': '8780', 'tls_backend': 'no'}}}})  2025-09-27 00:49:34.398709 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:49:34.398719 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'placement-api', 'value': {'container_name': 'placement_api', 'group': 'placement-api', 'image': 'registry.osism.tech/kolla/placement-api:2024.2', 'enabled': True, 'volumes': ['/etc/kolla/placement-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:8780'], 'timeout': '30'}, 'haproxy': {'placement_api': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8780', 'listen_port': '8780', 'tls_backend': 'no'}, 'placement_api_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8780', 'listen_port': '8780', 'tls_backend': 'no'}}}})  2025-09-27 00:49:34.398726 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:49:34.398733 | orchestrator | 2025-09-27 00:49:34.398739 | orchestrator | TASK [haproxy-config : Configuring firewall for placement] ********************* 2025-09-27 00:49:34.398746 | orchestrator | Saturday 27 September 2025 00:47:49 +0000 (0:00:00.460) 0:03:42.627 **** 2025-09-27 00:49:34.398753 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'placement_api', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8780', 'listen_port': '8780', 'tls_backend': 'no'}})  2025-09-27 00:49:34.398759 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'placement_api_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8780', 'listen_port': '8780', 'tls_backend': 'no'}})  2025-09-27 00:49:34.398766 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:49:34.398772 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'placement_api', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8780', 'listen_port': '8780', 'tls_backend': 'no'}})  2025-09-27 00:49:34.398779 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'placement_api_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8780', 'listen_port': '8780', 'tls_backend': 'no'}})  2025-09-27 00:49:34.398785 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:49:34.398791 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'placement_api', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8780', 'listen_port': '8780', 'tls_backend': 'no'}})  2025-09-27 00:49:34.398801 | orchestrator | skipping: [testbed-node-2] => (item={'key': 
'placement_api_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8780', 'listen_port': '8780', 'tls_backend': 'no'}})  2025-09-27 00:49:34.398808 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:49:34.398814 | orchestrator | 2025-09-27 00:49:34.398820 | orchestrator | TASK [proxysql-config : Copying over placement ProxySQL users config] ********** 2025-09-27 00:49:34.398826 | orchestrator | Saturday 27 September 2025 00:47:49 +0000 (0:00:00.666) 0:03:43.294 **** 2025-09-27 00:49:34.398832 | orchestrator | changed: [testbed-node-0] 2025-09-27 00:49:34.398838 | orchestrator | changed: [testbed-node-1] 2025-09-27 00:49:34.398845 | orchestrator | changed: [testbed-node-2] 2025-09-27 00:49:34.398851 | orchestrator | 2025-09-27 00:49:34.398857 | orchestrator | TASK [proxysql-config : Copying over placement ProxySQL rules config] ********** 2025-09-27 00:49:34.398863 | orchestrator | Saturday 27 September 2025 00:47:51 +0000 (0:00:01.233) 0:03:44.527 **** 2025-09-27 00:49:34.398869 | orchestrator | changed: [testbed-node-0] 2025-09-27 00:49:34.398875 | orchestrator | changed: [testbed-node-1] 2025-09-27 00:49:34.398881 | orchestrator | changed: [testbed-node-2] 2025-09-27 00:49:34.398887 | orchestrator | 2025-09-27 00:49:34.398893 | orchestrator | TASK [include_role : nova] ***************************************************** 2025-09-27 00:49:34.398899 | orchestrator | Saturday 27 September 2025 00:47:53 +0000 (0:00:01.968) 0:03:46.496 **** 2025-09-27 00:49:34.398906 | orchestrator | included: nova for testbed-node-0, testbed-node-1, testbed-node-2 2025-09-27 00:49:34.398912 | orchestrator | 2025-09-27 00:49:34.398918 | orchestrator | TASK [haproxy-config : Copying over nova haproxy config] *********************** 2025-09-27 00:49:34.398924 | orchestrator | Saturday 27 September 2025 00:47:54 +0000 (0:00:01.372) 0:03:47.869 **** 2025-09-27 00:49:34.398945 | orchestrator | changed: [testbed-node-0] => (item={'key': 'nova-api', 'value': {'container_name': 'nova_api', 'group': 'nova-api', 'image': 'registry.osism.tech/kolla/nova-api:2024.2', 'enabled': True, 'privileged': True, 'volumes': ['/etc/kolla/nova-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:8774 '], 'timeout': '30'}, 'haproxy': {'nova_api': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8774', 'listen_port': '8774', 'tls_backend': 'no'}, 'nova_api_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8774', 'listen_port': '8774', 'tls_backend': 'no'}, 'nova_metadata': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8775', 'listen_port': '8775', 'tls_backend': 'no'}, 'nova_metadata_external': {'enabled': 'no', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8775', 'listen_port': '8775', 'tls_backend': 'no'}}}}) 2025-09-27 00:49:34.398954 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'nova-scheduler', 'value': {'container_name': 'nova_scheduler', 'group': 'nova-scheduler', 'image': 'registry.osism.tech/kolla/nova-scheduler:2024.2', 'enabled': True, 'volumes': ['/etc/kolla/nova-scheduler/:/var/lib/kolla/config_files/:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port nova-scheduler 5672'], 'timeout': '30'}}})  2025-09-27 00:49:34.398960 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'nova-super-conductor', 'value': {'container_name': 'nova_super_conductor', 'group': 'nova-super-conductor', 'enabled': 'no', 'image': 'registry.osism.tech/kolla/nova-super-conductor:2024.2', 'volumes': ['/etc/kolla/nova-super-conductor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port nova-conductor 5672'], 'timeout': '30'}}})  2025-09-27 00:49:34.398972 | orchestrator | changed: [testbed-node-1] => (item={'key': 'nova-api', 'value': {'container_name': 'nova_api', 'group': 'nova-api', 'image': 'registry.osism.tech/kolla/nova-api:2024.2', 'enabled': True, 'privileged': True, 'volumes': ['/etc/kolla/nova-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:8774 '], 'timeout': '30'}, 'haproxy': {'nova_api': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8774', 'listen_port': '8774', 'tls_backend': 'no'}, 'nova_api_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8774', 'listen_port': '8774', 'tls_backend': 'no'}, 'nova_metadata': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8775', 'listen_port': '8775', 'tls_backend': 'no'}, 'nova_metadata_external': {'enabled': 'no', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8775', 'listen_port': '8775', 'tls_backend': 'no'}}}}) 2025-09-27 00:49:34.398978 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'nova-scheduler', 'value': {'container_name': 'nova_scheduler', 'group': 'nova-scheduler', 'image': 'registry.osism.tech/kolla/nova-scheduler:2024.2', 'enabled': True, 'volumes': ['/etc/kolla/nova-scheduler/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port nova-scheduler 5672'], 'timeout': '30'}}})  2025-09-27 00:49:34.398996 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'nova-super-conductor', 'value': {'container_name': 'nova_super_conductor', 'group': 'nova-super-conductor', 'enabled': 'no', 'image': 'registry.osism.tech/kolla/nova-super-conductor:2024.2', 'volumes': ['/etc/kolla/nova-super-conductor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port nova-conductor 5672'], 'timeout': '30'}}})  2025-09-27 00:49:34.399006 | orchestrator | changed: [testbed-node-2] => (item={'key': 'nova-api', 'value': 
{'container_name': 'nova_api', 'group': 'nova-api', 'image': 'registry.osism.tech/kolla/nova-api:2024.2', 'enabled': True, 'privileged': True, 'volumes': ['/etc/kolla/nova-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:8774 '], 'timeout': '30'}, 'haproxy': {'nova_api': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8774', 'listen_port': '8774', 'tls_backend': 'no'}, 'nova_api_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8774', 'listen_port': '8774', 'tls_backend': 'no'}, 'nova_metadata': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8775', 'listen_port': '8775', 'tls_backend': 'no'}, 'nova_metadata_external': {'enabled': 'no', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8775', 'listen_port': '8775', 'tls_backend': 'no'}}}}) 2025-09-27 00:49:34.399017 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'nova-scheduler', 'value': {'container_name': 'nova_scheduler', 'group': 'nova-scheduler', 'image': 'registry.osism.tech/kolla/nova-scheduler:2024.2', 'enabled': True, 'volumes': ['/etc/kolla/nova-scheduler/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port nova-scheduler 5672'], 'timeout': '30'}}})  2025-09-27 00:49:34.399024 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'nova-super-conductor', 'value': {'container_name': 'nova_super_conductor', 'group': 'nova-super-conductor', 'enabled': 'no', 'image': 'registry.osism.tech/kolla/nova-super-conductor:2024.2', 'volumes': ['/etc/kolla/nova-super-conductor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port nova-conductor 5672'], 'timeout': '30'}}})  2025-09-27 00:49:34.399030 | orchestrator | 2025-09-27 00:49:34.399037 | orchestrator | TASK [haproxy-config : Add configuration for nova when using single external frontend] *** 2025-09-27 00:49:34.399043 | orchestrator | Saturday 27 September 2025 00:47:58 +0000 (0:00:03.668) 0:03:51.538 **** 2025-09-27 00:49:34.399061 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'nova-api', 'value': {'container_name': 'nova_api', 'group': 'nova-api', 'image': 'registry.osism.tech/kolla/nova-api:2024.2', 'enabled': True, 'privileged': True, 'volumes': ['/etc/kolla/nova-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:8774 '], 'timeout': '30'}, 'haproxy': {'nova_api': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8774', 'listen_port': '8774', 'tls_backend': 'no'}, 'nova_api_external': {'enabled': True, 'mode': 'http', 'external': 
True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8774', 'listen_port': '8774', 'tls_backend': 'no'}, 'nova_metadata': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8775', 'listen_port': '8775', 'tls_backend': 'no'}, 'nova_metadata_external': {'enabled': 'no', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8775', 'listen_port': '8775', 'tls_backend': 'no'}}}})  2025-09-27 00:49:34.399072 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'nova-scheduler', 'value': {'container_name': 'nova_scheduler', 'group': 'nova-scheduler', 'image': 'registry.osism.tech/kolla/nova-scheduler:2024.2', 'enabled': True, 'volumes': ['/etc/kolla/nova-scheduler/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port nova-scheduler 5672'], 'timeout': '30'}}})  2025-09-27 00:49:34.399078 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'nova-super-conductor', 'value': {'container_name': 'nova_super_conductor', 'group': 'nova-super-conductor', 'enabled': 'no', 'image': 'registry.osism.tech/kolla/nova-super-conductor:2024.2', 'volumes': ['/etc/kolla/nova-super-conductor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port nova-conductor 5672'], 'timeout': '30'}}})  2025-09-27 00:49:34.399085 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:49:34.399096 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'nova-api', 'value': {'container_name': 'nova_api', 'group': 'nova-api', 'image': 'registry.osism.tech/kolla/nova-api:2024.2', 'enabled': True, 'privileged': True, 'volumes': ['/etc/kolla/nova-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:8774 '], 'timeout': '30'}, 'haproxy': {'nova_api': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8774', 'listen_port': '8774', 'tls_backend': 'no'}, 'nova_api_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8774', 'listen_port': '8774', 'tls_backend': 'no'}, 'nova_metadata': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8775', 'listen_port': '8775', 'tls_backend': 'no'}, 'nova_metadata_external': {'enabled': 'no', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8775', 'listen_port': '8775', 'tls_backend': 'no'}}}})  2025-09-27 00:49:34.399103 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'nova-scheduler', 'value': {'container_name': 'nova_scheduler', 'group': 'nova-scheduler', 'image': 'registry.osism.tech/kolla/nova-scheduler:2024.2', 'enabled': True, 'volumes': ['/etc/kolla/nova-scheduler/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': 
['CMD-SHELL', 'healthcheck_port nova-scheduler 5672'], 'timeout': '30'}}})  2025-09-27 00:49:34.399109 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'nova-super-conductor', 'value': {'container_name': 'nova_super_conductor', 'group': 'nova-super-conductor', 'enabled': 'no', 'image': 'registry.osism.tech/kolla/nova-super-conductor:2024.2', 'volumes': ['/etc/kolla/nova-super-conductor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port nova-conductor 5672'], 'timeout': '30'}}})  2025-09-27 00:49:34.399115 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:49:34.399134 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'nova-api', 'value': {'container_name': 'nova_api', 'group': 'nova-api', 'image': 'registry.osism.tech/kolla/nova-api:2024.2', 'enabled': True, 'privileged': True, 'volumes': ['/etc/kolla/nova-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/lib/modules:/lib/modules:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:8774 '], 'timeout': '30'}, 'haproxy': {'nova_api': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8774', 'listen_port': '8774', 'tls_backend': 'no'}, 'nova_api_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8774', 'listen_port': '8774', 'tls_backend': 'no'}, 'nova_metadata': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8775', 'listen_port': '8775', 'tls_backend': 'no'}, 'nova_metadata_external': {'enabled': 'no', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8775', 'listen_port': '8775', 'tls_backend': 'no'}}}})  2025-09-27 00:49:34.399141 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'nova-scheduler', 'value': {'container_name': 'nova_scheduler', 'group': 'nova-scheduler', 'image': 'registry.osism.tech/kolla/nova-scheduler:2024.2', 'enabled': True, 'volumes': ['/etc/kolla/nova-scheduler/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port nova-scheduler 5672'], 'timeout': '30'}}})  2025-09-27 00:49:34.399154 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'nova-super-conductor', 'value': {'container_name': 'nova_super_conductor', 'group': 'nova-super-conductor', 'enabled': 'no', 'image': 'registry.osism.tech/kolla/nova-super-conductor:2024.2', 'volumes': ['/etc/kolla/nova-super-conductor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port nova-conductor 5672'], 'timeout': '30'}}})  2025-09-27 00:49:34.399160 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:49:34.399166 | orchestrator | 2025-09-27 00:49:34.399173 | orchestrator | TASK [haproxy-config : Configuring firewall for nova] ************************** 
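The firewall task above, like the haproxy copy a few steps earlier, walks the same per-service 'haproxy' dictionaries; each enabled entry ultimately becomes an HAProxy frontend/backend pair bound on the internal or external VIP. As a reading aid only (the real role renders this from Jinja templates with many more options, and the VIP below is a placeholder, not a value from this log), one such entry maps to a stanza roughly like this:

```python
# Reading aid only -- not the kolla-ansible templates. Turn one 'haproxy' entry,
# shaped like the nova_api item above, into a simplified HAProxy "listen" stanza.
# The VIP address is a placeholder; the backend addresses reuse the node IPs that
# appear in the healthcheck_curl commands above.

def render_listen(name, cfg, vip, backends):
    lines = [
        f"listen {name}",
        f"    mode {cfg.get('mode', 'http')}",
        f"    bind {vip}:{cfg['listen_port']}",
    ]
    # Console proxies later in this log (e.g. nova-novncproxy) carry extra backend
    # options such as 'timeout tunnel 1h' in 'backend_http_extra'.
    for extra in cfg.get("backend_http_extra", []):
        lines.append(f"    {extra}")
    for host, addr in backends:
        lines.append(f"    server {host} {addr}:{cfg['port']} check")
    return "\n".join(lines)


nova_api = {"enabled": True, "mode": "http", "external": False,
            "port": "8774", "listen_port": "8774", "tls_backend": "no"}

print(render_listen(
    "nova_api",
    nova_api,
    vip="192.168.16.9",  # placeholder internal VIP, not taken from this log
    backends=[("testbed-node-0", "192.168.16.10"),
              ("testbed-node-1", "192.168.16.11"),
              ("testbed-node-2", "192.168.16.12")],
))
```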
2025-09-27 00:49:34.399179 | orchestrator | Saturday 27 September 2025 00:47:59 +0000 (0:00:00.935) 0:03:52.474 **** 2025-09-27 00:49:34.399185 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'nova_api', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8774', 'listen_port': '8774', 'tls_backend': 'no'}})  2025-09-27 00:49:34.399191 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'nova_api_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8774', 'listen_port': '8774', 'tls_backend': 'no'}})  2025-09-27 00:49:34.399209 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'nova_metadata', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8775', 'listen_port': '8775', 'tls_backend': 'no'}})  2025-09-27 00:49:34.399215 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'nova_metadata_external', 'value': {'enabled': 'no', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8775', 'listen_port': '8775', 'tls_backend': 'no'}})  2025-09-27 00:49:34.399222 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:49:34.399228 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'nova_api', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8774', 'listen_port': '8774', 'tls_backend': 'no'}})  2025-09-27 00:49:34.399234 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'nova_api_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8774', 'listen_port': '8774', 'tls_backend': 'no'}})  2025-09-27 00:49:34.399240 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'nova_metadata', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8775', 'listen_port': '8775', 'tls_backend': 'no'}})  2025-09-27 00:49:34.399247 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'nova_metadata_external', 'value': {'enabled': 'no', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8775', 'listen_port': '8775', 'tls_backend': 'no'}})  2025-09-27 00:49:34.399264 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:49:34.399271 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'nova_api', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8774', 'listen_port': '8774', 'tls_backend': 'no'}})  2025-09-27 00:49:34.399277 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'nova_api_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8774', 'listen_port': '8774', 'tls_backend': 'no'}})  2025-09-27 00:49:34.399294 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'nova_metadata', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '8775', 'listen_port': '8775', 'tls_backend': 'no'}})  2025-09-27 00:49:34.399306 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'nova_metadata_external', 'value': {'enabled': 'no', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '8775', 'listen_port': '8775', 'tls_backend': 'no'}})  2025-09-27 00:49:34.399312 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:49:34.399318 | orchestrator | 2025-09-27 00:49:34.399324 | orchestrator | TASK [proxysql-config : Copying over nova ProxySQL users config] *************** 2025-09-27 
00:49:34.399331 | orchestrator | Saturday 27 September 2025 00:47:59 +0000 (0:00:00.851) 0:03:53.326 **** 2025-09-27 00:49:34.399337 | orchestrator | changed: [testbed-node-0] 2025-09-27 00:49:34.399343 | orchestrator | changed: [testbed-node-2] 2025-09-27 00:49:34.399349 | orchestrator | changed: [testbed-node-1] 2025-09-27 00:49:34.399355 | orchestrator | 2025-09-27 00:49:34.399361 | orchestrator | TASK [proxysql-config : Copying over nova ProxySQL rules config] *************** 2025-09-27 00:49:34.399367 | orchestrator | Saturday 27 September 2025 00:48:01 +0000 (0:00:01.310) 0:03:54.637 **** 2025-09-27 00:49:34.399373 | orchestrator | changed: [testbed-node-0] 2025-09-27 00:49:34.399380 | orchestrator | changed: [testbed-node-1] 2025-09-27 00:49:34.399386 | orchestrator | changed: [testbed-node-2] 2025-09-27 00:49:34.399392 | orchestrator | 2025-09-27 00:49:34.399398 | orchestrator | TASK [include_role : nova-cell] ************************************************ 2025-09-27 00:49:34.399404 | orchestrator | Saturday 27 September 2025 00:48:03 +0000 (0:00:01.897) 0:03:56.535 **** 2025-09-27 00:49:34.399410 | orchestrator | included: nova-cell for testbed-node-0, testbed-node-1, testbed-node-2 2025-09-27 00:49:34.399416 | orchestrator | 2025-09-27 00:49:34.399422 | orchestrator | TASK [nova-cell : Configure loadbalancer for nova-novncproxy] ****************** 2025-09-27 00:49:34.399428 | orchestrator | Saturday 27 September 2025 00:48:04 +0000 (0:00:01.419) 0:03:57.954 **** 2025-09-27 00:49:34.399435 | orchestrator | included: /ansible/roles/nova-cell/tasks/cell_proxy_loadbalancer.yml for testbed-node-0, testbed-node-1, testbed-node-2 => (item=nova-novncproxy) 2025-09-27 00:49:34.399441 | orchestrator | 2025-09-27 00:49:34.399447 | orchestrator | TASK [haproxy-config : Copying over nova-cell:nova-novncproxy haproxy config] *** 2025-09-27 00:49:34.399453 | orchestrator | Saturday 27 September 2025 00:48:05 +0000 (0:00:00.744) 0:03:58.698 **** 2025-09-27 00:49:34.399460 | orchestrator | changed: [testbed-node-0] => (item={'key': 'nova-novncproxy', 'value': {'group': 'nova-novncproxy', 'enabled': True, 'haproxy': {'nova_novncproxy': {'enabled': True, 'mode': 'http', 'external': False, 'port': '6080', 'listen_port': '6080', 'backend_http_extra': ['timeout tunnel 1h']}, 'nova_novncproxy_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '6080', 'listen_port': '6080', 'backend_http_extra': ['timeout tunnel 1h']}}}}) 2025-09-27 00:49:34.399467 | orchestrator | changed: [testbed-node-1] => (item={'key': 'nova-novncproxy', 'value': {'group': 'nova-novncproxy', 'enabled': True, 'haproxy': {'nova_novncproxy': {'enabled': True, 'mode': 'http', 'external': False, 'port': '6080', 'listen_port': '6080', 'backend_http_extra': ['timeout tunnel 1h']}, 'nova_novncproxy_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '6080', 'listen_port': '6080', 'backend_http_extra': ['timeout tunnel 1h']}}}}) 2025-09-27 00:49:34.399473 | orchestrator | changed: [testbed-node-2] => (item={'key': 'nova-novncproxy', 'value': {'group': 'nova-novncproxy', 'enabled': True, 'haproxy': {'nova_novncproxy': {'enabled': True, 'mode': 'http', 'external': False, 'port': '6080', 'listen_port': '6080', 'backend_http_extra': ['timeout tunnel 1h']}, 'nova_novncproxy_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '6080', 'listen_port': '6080', 
'backend_http_extra': ['timeout tunnel 1h']}}}}) 2025-09-27 00:49:34.399480 | orchestrator | 2025-09-27 00:49:34.399486 | orchestrator | TASK [haproxy-config : Add configuration for nova-cell:nova-novncproxy when using single external frontend] *** 2025-09-27 00:49:34.399497 | orchestrator | Saturday 27 September 2025 00:48:09 +0000 (0:00:04.151) 0:04:02.850 **** 2025-09-27 00:49:34.399515 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'nova-novncproxy', 'value': {'group': 'nova-novncproxy', 'enabled': True, 'haproxy': {'nova_novncproxy': {'enabled': True, 'mode': 'http', 'external': False, 'port': '6080', 'listen_port': '6080', 'backend_http_extra': ['timeout tunnel 1h']}, 'nova_novncproxy_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '6080', 'listen_port': '6080', 'backend_http_extra': ['timeout tunnel 1h']}}}})  2025-09-27 00:49:34.399522 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:49:34.399533 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'nova-novncproxy', 'value': {'group': 'nova-novncproxy', 'enabled': True, 'haproxy': {'nova_novncproxy': {'enabled': True, 'mode': 'http', 'external': False, 'port': '6080', 'listen_port': '6080', 'backend_http_extra': ['timeout tunnel 1h']}, 'nova_novncproxy_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '6080', 'listen_port': '6080', 'backend_http_extra': ['timeout tunnel 1h']}}}})  2025-09-27 00:49:34.399540 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:49:34.399546 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'nova-novncproxy', 'value': {'group': 'nova-novncproxy', 'enabled': True, 'haproxy': {'nova_novncproxy': {'enabled': True, 'mode': 'http', 'external': False, 'port': '6080', 'listen_port': '6080', 'backend_http_extra': ['timeout tunnel 1h']}, 'nova_novncproxy_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '6080', 'listen_port': '6080', 'backend_http_extra': ['timeout tunnel 1h']}}}})  2025-09-27 00:49:34.399552 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:49:34.399558 | orchestrator | 2025-09-27 00:49:34.399565 | orchestrator | TASK [haproxy-config : Configuring firewall for nova-cell:nova-novncproxy] ***** 2025-09-27 00:49:34.399571 | orchestrator | Saturday 27 September 2025 00:48:10 +0000 (0:00:01.369) 0:04:04.219 **** 2025-09-27 00:49:34.399577 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'nova_novncproxy', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '6080', 'listen_port': '6080', 'backend_http_extra': ['timeout tunnel 1h']}})  2025-09-27 00:49:34.399583 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'nova_novncproxy_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '6080', 'listen_port': '6080', 'backend_http_extra': ['timeout tunnel 1h']}})  2025-09-27 00:49:34.399590 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:49:34.399596 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'nova_novncproxy', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '6080', 'listen_port': '6080', 'backend_http_extra': ['timeout tunnel 1h']}})  2025-09-27 00:49:34.399602 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'nova_novncproxy_external', 'value': {'enabled': True, 'mode': 'http', 'external': 
True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '6080', 'listen_port': '6080', 'backend_http_extra': ['timeout tunnel 1h']}})  2025-09-27 00:49:34.399609 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:49:34.399615 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'nova_novncproxy', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '6080', 'listen_port': '6080', 'backend_http_extra': ['timeout tunnel 1h']}})  2025-09-27 00:49:34.399622 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'nova_novncproxy_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '6080', 'listen_port': '6080', 'backend_http_extra': ['timeout tunnel 1h']}})  2025-09-27 00:49:34.399628 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:49:34.399652 | orchestrator | 2025-09-27 00:49:34.399659 | orchestrator | TASK [proxysql-config : Copying over nova-cell ProxySQL users config] ********** 2025-09-27 00:49:34.399665 | orchestrator | Saturday 27 September 2025 00:48:12 +0000 (0:00:01.544) 0:04:05.763 **** 2025-09-27 00:49:34.399671 | orchestrator | changed: [testbed-node-0] 2025-09-27 00:49:34.399677 | orchestrator | changed: [testbed-node-1] 2025-09-27 00:49:34.399683 | orchestrator | changed: [testbed-node-2] 2025-09-27 00:49:34.399689 | orchestrator | 2025-09-27 00:49:34.399695 | orchestrator | TASK [proxysql-config : Copying over nova-cell ProxySQL rules config] ********** 2025-09-27 00:49:34.399701 | orchestrator | Saturday 27 September 2025 00:48:14 +0000 (0:00:02.428) 0:04:08.191 **** 2025-09-27 00:49:34.399708 | orchestrator | changed: [testbed-node-0] 2025-09-27 00:49:34.399714 | orchestrator | changed: [testbed-node-1] 2025-09-27 00:49:34.399720 | orchestrator | changed: [testbed-node-2] 2025-09-27 00:49:34.399726 | orchestrator | 2025-09-27 00:49:34.399732 | orchestrator | TASK [nova-cell : Configure loadbalancer for nova-spicehtml5proxy] ************* 2025-09-27 00:49:34.399738 | orchestrator | Saturday 27 September 2025 00:48:17 +0000 (0:00:02.942) 0:04:11.134 **** 2025-09-27 00:49:34.399755 | orchestrator | included: /ansible/roles/nova-cell/tasks/cell_proxy_loadbalancer.yml for testbed-node-0, testbed-node-1, testbed-node-2 => (item=nova-spicehtml5proxy) 2025-09-27 00:49:34.399762 | orchestrator | 2025-09-27 00:49:34.399768 | orchestrator | TASK [haproxy-config : Copying over nova-cell:nova-spicehtml5proxy haproxy config] *** 2025-09-27 00:49:34.399775 | orchestrator | Saturday 27 September 2025 00:48:19 +0000 (0:00:01.333) 0:04:12.468 **** 2025-09-27 00:49:34.399784 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'nova-spicehtml5proxy', 'value': {'group': 'nova-spicehtml5proxy', 'enabled': False, 'haproxy': {'nova_spicehtml5proxy': {'enabled': False, 'mode': 'http', 'external': False, 'port': '6082', 'listen_port': '6082', 'backend_http_extra': ['timeout tunnel 1h']}, 'nova_spicehtml5proxy_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '6082', 'listen_port': '6082', 'backend_http_extra': ['timeout tunnel 1h']}}}})  2025-09-27 00:49:34.399791 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:49:34.399797 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'nova-spicehtml5proxy', 'value': {'group': 'nova-spicehtml5proxy', 'enabled': False, 'haproxy': {'nova_spicehtml5proxy': {'enabled': False, 'mode': 'http', 'external': False, 'port': '6082', 'listen_port': '6082', 
'backend_http_extra': ['timeout tunnel 1h']}, 'nova_spicehtml5proxy_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '6082', 'listen_port': '6082', 'backend_http_extra': ['timeout tunnel 1h']}}}})  2025-09-27 00:49:34.399804 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:49:34.399810 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'nova-spicehtml5proxy', 'value': {'group': 'nova-spicehtml5proxy', 'enabled': False, 'haproxy': {'nova_spicehtml5proxy': {'enabled': False, 'mode': 'http', 'external': False, 'port': '6082', 'listen_port': '6082', 'backend_http_extra': ['timeout tunnel 1h']}, 'nova_spicehtml5proxy_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '6082', 'listen_port': '6082', 'backend_http_extra': ['timeout tunnel 1h']}}}})  2025-09-27 00:49:34.399817 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:49:34.399823 | orchestrator | 2025-09-27 00:49:34.399829 | orchestrator | TASK [haproxy-config : Add configuration for nova-cell:nova-spicehtml5proxy when using single external frontend] *** 2025-09-27 00:49:34.399835 | orchestrator | Saturday 27 September 2025 00:48:20 +0000 (0:00:01.210) 0:04:13.678 **** 2025-09-27 00:49:34.399841 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'nova-spicehtml5proxy', 'value': {'group': 'nova-spicehtml5proxy', 'enabled': False, 'haproxy': {'nova_spicehtml5proxy': {'enabled': False, 'mode': 'http', 'external': False, 'port': '6082', 'listen_port': '6082', 'backend_http_extra': ['timeout tunnel 1h']}, 'nova_spicehtml5proxy_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '6082', 'listen_port': '6082', 'backend_http_extra': ['timeout tunnel 1h']}}}})  2025-09-27 00:49:34.399852 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:49:34.399858 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'nova-spicehtml5proxy', 'value': {'group': 'nova-spicehtml5proxy', 'enabled': False, 'haproxy': {'nova_spicehtml5proxy': {'enabled': False, 'mode': 'http', 'external': False, 'port': '6082', 'listen_port': '6082', 'backend_http_extra': ['timeout tunnel 1h']}, 'nova_spicehtml5proxy_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '6082', 'listen_port': '6082', 'backend_http_extra': ['timeout tunnel 1h']}}}})  2025-09-27 00:49:34.399865 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:49:34.399871 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'nova-spicehtml5proxy', 'value': {'group': 'nova-spicehtml5proxy', 'enabled': False, 'haproxy': {'nova_spicehtml5proxy': {'enabled': False, 'mode': 'http', 'external': False, 'port': '6082', 'listen_port': '6082', 'backend_http_extra': ['timeout tunnel 1h']}, 'nova_spicehtml5proxy_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '6082', 'listen_port': '6082', 'backend_http_extra': ['timeout tunnel 1h']}}}})  2025-09-27 00:49:34.399878 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:49:34.399884 | orchestrator | 2025-09-27 00:49:34.399890 | orchestrator | TASK [haproxy-config : Configuring firewall for nova-cell:nova-spicehtml5proxy] *** 2025-09-27 00:49:34.399896 | orchestrator | Saturday 27 September 2025 00:48:21 +0000 (0:00:01.224) 0:04:14.903 **** 2025-09-27 00:49:34.399902 | orchestrator | skipping: 
[testbed-node-0] 2025-09-27 00:49:34.399908 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:49:34.399914 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:49:34.399921 | orchestrator | 2025-09-27 00:49:34.399937 | orchestrator | TASK [proxysql-config : Copying over nova-cell ProxySQL users config] ********** 2025-09-27 00:49:34.399944 | orchestrator | Saturday 27 September 2025 00:48:23 +0000 (0:00:01.696) 0:04:16.599 **** 2025-09-27 00:49:34.399950 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:49:34.399956 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:49:34.399962 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:49:34.399969 | orchestrator | 2025-09-27 00:49:34.399975 | orchestrator | TASK [proxysql-config : Copying over nova-cell ProxySQL rules config] ********** 2025-09-27 00:49:34.399981 | orchestrator | Saturday 27 September 2025 00:48:25 +0000 (0:00:02.734) 0:04:19.334 **** 2025-09-27 00:49:34.399987 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:49:34.399993 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:49:34.399999 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:49:34.400005 | orchestrator | 2025-09-27 00:49:34.400015 | orchestrator | TASK [nova-cell : Configure loadbalancer for nova-serialproxy] ***************** 2025-09-27 00:49:34.400021 | orchestrator | Saturday 27 September 2025 00:48:28 +0000 (0:00:02.990) 0:04:22.324 **** 2025-09-27 00:49:34.400027 | orchestrator | included: /ansible/roles/nova-cell/tasks/cell_proxy_loadbalancer.yml for testbed-node-0, testbed-node-1, testbed-node-2 => (item=nova-serialproxy) 2025-09-27 00:49:34.400033 | orchestrator | 2025-09-27 00:49:34.400039 | orchestrator | TASK [haproxy-config : Copying over nova-cell:nova-serialproxy haproxy config] *** 2025-09-27 00:49:34.400046 | orchestrator | Saturday 27 September 2025 00:48:29 +0000 (0:00:00.812) 0:04:23.136 **** 2025-09-27 00:49:34.400052 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'nova-serialproxy', 'value': {'group': 'nova-serialproxy', 'enabled': False, 'haproxy': {'nova_serialconsole_proxy': {'enabled': False, 'mode': 'http', 'external': False, 'port': '6083', 'listen_port': '6083', 'backend_http_extra': ['timeout tunnel 10m']}, 'nova_serialconsole_proxy_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '6083', 'listen_port': '6083', 'backend_http_extra': ['timeout tunnel 10m']}}}})  2025-09-27 00:49:34.400063 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:49:34.400069 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'nova-serialproxy', 'value': {'group': 'nova-serialproxy', 'enabled': False, 'haproxy': {'nova_serialconsole_proxy': {'enabled': False, 'mode': 'http', 'external': False, 'port': '6083', 'listen_port': '6083', 'backend_http_extra': ['timeout tunnel 10m']}, 'nova_serialconsole_proxy_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '6083', 'listen_port': '6083', 'backend_http_extra': ['timeout tunnel 10m']}}}})  2025-09-27 00:49:34.400075 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:49:34.400082 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'nova-serialproxy', 'value': {'group': 'nova-serialproxy', 'enabled': False, 'haproxy': {'nova_serialconsole_proxy': {'enabled': False, 'mode': 'http', 'external': False, 'port': '6083', 'listen_port': '6083', 'backend_http_extra': ['timeout tunnel 10m']}, 'nova_serialconsole_proxy_external': 
{'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '6083', 'listen_port': '6083', 'backend_http_extra': ['timeout tunnel 10m']}}}})  2025-09-27 00:49:34.400088 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:49:34.400094 | orchestrator | 2025-09-27 00:49:34.400100 | orchestrator | TASK [haproxy-config : Add configuration for nova-cell:nova-serialproxy when using single external frontend] *** 2025-09-27 00:49:34.400106 | orchestrator | Saturday 27 September 2025 00:48:31 +0000 (0:00:01.400) 0:04:24.537 **** 2025-09-27 00:49:34.400113 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'nova-serialproxy', 'value': {'group': 'nova-serialproxy', 'enabled': False, 'haproxy': {'nova_serialconsole_proxy': {'enabled': False, 'mode': 'http', 'external': False, 'port': '6083', 'listen_port': '6083', 'backend_http_extra': ['timeout tunnel 10m']}, 'nova_serialconsole_proxy_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '6083', 'listen_port': '6083', 'backend_http_extra': ['timeout tunnel 10m']}}}})  2025-09-27 00:49:34.400119 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:49:34.400125 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'nova-serialproxy', 'value': {'group': 'nova-serialproxy', 'enabled': False, 'haproxy': {'nova_serialconsole_proxy': {'enabled': False, 'mode': 'http', 'external': False, 'port': '6083', 'listen_port': '6083', 'backend_http_extra': ['timeout tunnel 10m']}, 'nova_serialconsole_proxy_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '6083', 'listen_port': '6083', 'backend_http_extra': ['timeout tunnel 10m']}}}})  2025-09-27 00:49:34.400132 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:49:34.400149 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'nova-serialproxy', 'value': {'group': 'nova-serialproxy', 'enabled': False, 'haproxy': {'nova_serialconsole_proxy': {'enabled': False, 'mode': 'http', 'external': False, 'port': '6083', 'listen_port': '6083', 'backend_http_extra': ['timeout tunnel 10m']}, 'nova_serialconsole_proxy_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '6083', 'listen_port': '6083', 'backend_http_extra': ['timeout tunnel 10m']}}}})  2025-09-27 00:49:34.400156 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:49:34.400162 | orchestrator | 2025-09-27 00:49:34.400171 | orchestrator | TASK [haproxy-config : Configuring firewall for nova-cell:nova-serialproxy] **** 2025-09-27 00:49:34.400178 | orchestrator | Saturday 27 September 2025 00:48:32 +0000 (0:00:01.353) 0:04:25.891 **** 2025-09-27 00:49:34.400184 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:49:34.400190 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:49:34.400196 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:49:34.400213 | orchestrator | 2025-09-27 00:49:34.400220 | orchestrator | TASK [proxysql-config : Copying over nova-cell ProxySQL users config] ********** 2025-09-27 00:49:34.400230 | orchestrator | Saturday 27 September 2025 00:48:34 +0000 (0:00:01.511) 0:04:27.403 **** 2025-09-27 00:49:34.400236 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:49:34.400242 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:49:34.400248 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:49:34.400254 | orchestrator | 2025-09-27 00:49:34.400260 | orchestrator 
| TASK [proxysql-config : Copying over nova-cell ProxySQL rules config] ********** 2025-09-27 00:49:34.400266 | orchestrator | Saturday 27 September 2025 00:48:36 +0000 (0:00:02.357) 0:04:29.760 **** 2025-09-27 00:49:34.400272 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:49:34.400279 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:49:34.400285 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:49:34.400291 | orchestrator | 2025-09-27 00:49:34.400297 | orchestrator | TASK [include_role : octavia] ************************************************** 2025-09-27 00:49:34.400303 | orchestrator | Saturday 27 September 2025 00:48:39 +0000 (0:00:03.202) 0:04:32.962 **** 2025-09-27 00:49:34.400309 | orchestrator | included: octavia for testbed-node-0, testbed-node-1, testbed-node-2 2025-09-27 00:49:34.400315 | orchestrator | 2025-09-27 00:49:34.400321 | orchestrator | TASK [haproxy-config : Copying over octavia haproxy config] ******************** 2025-09-27 00:49:34.400327 | orchestrator | Saturday 27 September 2025 00:48:41 +0000 (0:00:01.526) 0:04:34.489 **** 2025-09-27 00:49:34.400334 | orchestrator | changed: [testbed-node-0] => (item={'key': 'octavia-api', 'value': {'container_name': 'octavia_api', 'group': 'octavia-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/octavia-api:2024.2', 'volumes': ['/etc/kolla/octavia-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', '', 'octavia_driver_agent:/var/run/octavia/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:9876'], 'timeout': '30'}, 'haproxy': {'octavia_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9876', 'listen_port': '9876', 'tls_backend': 'no'}, 'octavia_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9876', 'listen_port': '9876', 'tls_backend': 'no'}}}}) 2025-09-27 00:49:34.400340 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'octavia-driver-agent', 'value': {'container_name': 'octavia_driver_agent', 'group': 'octavia-driver-agent', 'enabled': True, 'image': 'registry.osism.tech/kolla/octavia-driver-agent:2024.2', 'volumes': ['/etc/kolla/octavia-driver-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', '', 'octavia_driver_agent:/var/run/octavia/'], 'dimensions': {}}})  2025-09-27 00:49:34.400347 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'octavia-health-manager', 'value': {'container_name': 'octavia_health_manager', 'group': 'octavia-health-manager', 'enabled': True, 'image': 'registry.osism.tech/kolla/octavia-health-manager:2024.2', 'volumes': ['/etc/kolla/octavia-health-manager/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port octavia-health-manager 3306'], 'timeout': '30'}}})  2025-09-27 00:49:34.400364 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'octavia-housekeeping', 'value': {'container_name': 'octavia_housekeeping', 'group': 'octavia-housekeeping', 'enabled': True, 'image': 'registry.osism.tech/kolla/octavia-housekeeping:2024.2', 'volumes': 
['/etc/kolla/octavia-housekeeping/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port octavia-housekeeping 3306'], 'timeout': '30'}}})  2025-09-27 00:49:34.400379 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'octavia-worker', 'value': {'container_name': 'octavia_worker', 'group': 'octavia-worker', 'enabled': True, 'image': 'registry.osism.tech/kolla/octavia-worker:2024.2', 'volumes': ['/etc/kolla/octavia-worker/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port octavia-worker 5672'], 'timeout': '30'}}})  2025-09-27 00:49:34.400385 | orchestrator | changed: [testbed-node-1] => (item={'key': 'octavia-api', 'value': {'container_name': 'octavia_api', 'group': 'octavia-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/octavia-api:2024.2', 'volumes': ['/etc/kolla/octavia-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', '', 'octavia_driver_agent:/var/run/octavia/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:9876'], 'timeout': '30'}, 'haproxy': {'octavia_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9876', 'listen_port': '9876', 'tls_backend': 'no'}, 'octavia_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9876', 'listen_port': '9876', 'tls_backend': 'no'}}}}) 2025-09-27 00:49:34.400392 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'octavia-driver-agent', 'value': {'container_name': 'octavia_driver_agent', 'group': 'octavia-driver-agent', 'enabled': True, 'image': 'registry.osism.tech/kolla/octavia-driver-agent:2024.2', 'volumes': ['/etc/kolla/octavia-driver-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', '', 'octavia_driver_agent:/var/run/octavia/'], 'dimensions': {}}})  2025-09-27 00:49:34.400398 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'octavia-health-manager', 'value': {'container_name': 'octavia_health_manager', 'group': 'octavia-health-manager', 'enabled': True, 'image': 'registry.osism.tech/kolla/octavia-health-manager:2024.2', 'volumes': ['/etc/kolla/octavia-health-manager/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port octavia-health-manager 3306'], 'timeout': '30'}}})  2025-09-27 00:49:34.400405 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'octavia-housekeeping', 'value': {'container_name': 'octavia_housekeeping', 'group': 'octavia-housekeeping', 'enabled': True, 'image': 'registry.osism.tech/kolla/octavia-housekeeping:2024.2', 'volumes': ['/etc/kolla/octavia-housekeeping/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port octavia-housekeeping 3306'], 'timeout': '30'}}})  2025-09-27 00:49:34.400422 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'octavia-worker', 'value': {'container_name': 'octavia_worker', 'group': 'octavia-worker', 'enabled': True, 'image': 'registry.osism.tech/kolla/octavia-worker:2024.2', 'volumes': ['/etc/kolla/octavia-worker/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port octavia-worker 5672'], 'timeout': '30'}}})  2025-09-27 00:49:34.400431 | orchestrator | changed: [testbed-node-2] => (item={'key': 'octavia-api', 'value': {'container_name': 'octavia_api', 'group': 'octavia-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/octavia-api:2024.2', 'volumes': ['/etc/kolla/octavia-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', '', 'octavia_driver_agent:/var/run/octavia/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:9876'], 'timeout': '30'}, 'haproxy': {'octavia_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9876', 'listen_port': '9876', 'tls_backend': 'no'}, 'octavia_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9876', 'listen_port': '9876', 'tls_backend': 'no'}}}}) 2025-09-27 00:49:34.400442 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'octavia-driver-agent', 'value': {'container_name': 'octavia_driver_agent', 'group': 'octavia-driver-agent', 'enabled': True, 'image': 'registry.osism.tech/kolla/octavia-driver-agent:2024.2', 'volumes': ['/etc/kolla/octavia-driver-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', '', 'octavia_driver_agent:/var/run/octavia/'], 'dimensions': {}}})  2025-09-27 00:49:34.400449 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'octavia-health-manager', 'value': {'container_name': 'octavia_health_manager', 'group': 'octavia-health-manager', 'enabled': True, 'image': 'registry.osism.tech/kolla/octavia-health-manager:2024.2', 'volumes': ['/etc/kolla/octavia-health-manager/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port octavia-health-manager 3306'], 'timeout': '30'}}})  2025-09-27 00:49:34.400455 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'octavia-housekeeping', 'value': {'container_name': 'octavia_housekeeping', 'group': 'octavia-housekeeping', 'enabled': True, 'image': 'registry.osism.tech/kolla/octavia-housekeeping:2024.2', 'volumes': ['/etc/kolla/octavia-housekeeping/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': 
'3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port octavia-housekeeping 3306'], 'timeout': '30'}}})  2025-09-27 00:49:34.400462 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'octavia-worker', 'value': {'container_name': 'octavia_worker', 'group': 'octavia-worker', 'enabled': True, 'image': 'registry.osism.tech/kolla/octavia-worker:2024.2', 'volumes': ['/etc/kolla/octavia-worker/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port octavia-worker 5672'], 'timeout': '30'}}})  2025-09-27 00:49:34.400468 | orchestrator | 2025-09-27 00:49:34.400474 | orchestrator | TASK [haproxy-config : Add configuration for octavia when using single external frontend] *** 2025-09-27 00:49:34.400480 | orchestrator | Saturday 27 September 2025 00:48:44 +0000 (0:00:03.343) 0:04:37.833 **** 2025-09-27 00:49:34.400498 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'octavia-api', 'value': {'container_name': 'octavia_api', 'group': 'octavia-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/octavia-api:2024.2', 'volumes': ['/etc/kolla/octavia-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', '', 'octavia_driver_agent:/var/run/octavia/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:9876'], 'timeout': '30'}, 'haproxy': {'octavia_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9876', 'listen_port': '9876', 'tls_backend': 'no'}, 'octavia_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9876', 'listen_port': '9876', 'tls_backend': 'no'}}}})  2025-09-27 00:49:34.400512 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'octavia-driver-agent', 'value': {'container_name': 'octavia_driver_agent', 'group': 'octavia-driver-agent', 'enabled': True, 'image': 'registry.osism.tech/kolla/octavia-driver-agent:2024.2', 'volumes': ['/etc/kolla/octavia-driver-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', '', 'octavia_driver_agent:/var/run/octavia/'], 'dimensions': {}}})  2025-09-27 00:49:34.400519 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'octavia-health-manager', 'value': {'container_name': 'octavia_health_manager', 'group': 'octavia-health-manager', 'enabled': True, 'image': 'registry.osism.tech/kolla/octavia-health-manager:2024.2', 'volumes': ['/etc/kolla/octavia-health-manager/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port octavia-health-manager 3306'], 'timeout': '30'}}})  2025-09-27 00:49:34.400526 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'octavia-housekeeping', 'value': {'container_name': 'octavia_housekeeping', 'group': 'octavia-housekeeping', 'enabled': True, 'image': 'registry.osism.tech/kolla/octavia-housekeeping:2024.2', 'volumes': ['/etc/kolla/octavia-housekeeping/:/var/lib/kolla/config_files/:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port octavia-housekeeping 3306'], 'timeout': '30'}}})  2025-09-27 00:49:34.400532 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'octavia-worker', 'value': {'container_name': 'octavia_worker', 'group': 'octavia-worker', 'enabled': True, 'image': 'registry.osism.tech/kolla/octavia-worker:2024.2', 'volumes': ['/etc/kolla/octavia-worker/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port octavia-worker 5672'], 'timeout': '30'}}})  2025-09-27 00:49:34.400538 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:49:34.400545 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'octavia-api', 'value': {'container_name': 'octavia_api', 'group': 'octavia-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/octavia-api:2024.2', 'volumes': ['/etc/kolla/octavia-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', '', 'octavia_driver_agent:/var/run/octavia/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:9876'], 'timeout': '30'}, 'haproxy': {'octavia_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9876', 'listen_port': '9876', 'tls_backend': 'no'}, 'octavia_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9876', 'listen_port': '9876', 'tls_backend': 'no'}}}})  2025-09-27 00:49:34.400562 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'octavia-driver-agent', 'value': {'container_name': 'octavia_driver_agent', 'group': 'octavia-driver-agent', 'enabled': True, 'image': 'registry.osism.tech/kolla/octavia-driver-agent:2024.2', 'volumes': ['/etc/kolla/octavia-driver-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', '', 'octavia_driver_agent:/var/run/octavia/'], 'dimensions': {}}})  2025-09-27 00:49:34.400578 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'octavia-health-manager', 'value': {'container_name': 'octavia_health_manager', 'group': 'octavia-health-manager', 'enabled': True, 'image': 'registry.osism.tech/kolla/octavia-health-manager:2024.2', 'volumes': ['/etc/kolla/octavia-health-manager/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port octavia-health-manager 3306'], 'timeout': '30'}}})  2025-09-27 00:49:34.400585 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'octavia-housekeeping', 'value': {'container_name': 'octavia_housekeeping', 'group': 'octavia-housekeeping', 'enabled': True, 'image': 'registry.osism.tech/kolla/octavia-housekeeping:2024.2', 'volumes': ['/etc/kolla/octavia-housekeeping/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port octavia-housekeeping 3306'], 'timeout': '30'}}})  2025-09-27 00:49:34.400591 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'octavia-worker', 'value': {'container_name': 'octavia_worker', 'group': 'octavia-worker', 'enabled': True, 'image': 'registry.osism.tech/kolla/octavia-worker:2024.2', 'volumes': ['/etc/kolla/octavia-worker/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port octavia-worker 5672'], 'timeout': '30'}}})  2025-09-27 00:49:34.400598 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:49:34.400604 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'octavia-api', 'value': {'container_name': 'octavia_api', 'group': 'octavia-api', 'enabled': True, 'image': 'registry.osism.tech/kolla/octavia-api:2024.2', 'volumes': ['/etc/kolla/octavia-api/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', '', 'octavia_driver_agent:/var/run/octavia/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:9876'], 'timeout': '30'}, 'haproxy': {'octavia_api': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9876', 'listen_port': '9876', 'tls_backend': 'no'}, 'octavia_api_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9876', 'listen_port': '9876', 'tls_backend': 'no'}}}})  2025-09-27 00:49:34.400611 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'octavia-driver-agent', 'value': {'container_name': 'octavia_driver_agent', 'group': 'octavia-driver-agent', 'enabled': True, 'image': 'registry.osism.tech/kolla/octavia-driver-agent:2024.2', 'volumes': ['/etc/kolla/octavia-driver-agent/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', '', 'octavia_driver_agent:/var/run/octavia/'], 'dimensions': {}}})  2025-09-27 00:49:34.400617 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'octavia-health-manager', 'value': {'container_name': 'octavia_health_manager', 'group': 'octavia-health-manager', 'enabled': True, 'image': 'registry.osism.tech/kolla/octavia-health-manager:2024.2', 'volumes': ['/etc/kolla/octavia-health-manager/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port octavia-health-manager 3306'], 'timeout': '30'}}})  2025-09-27 00:49:34.400638 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'octavia-housekeeping', 'value': {'container_name': 'octavia_housekeeping', 'group': 'octavia-housekeeping', 'enabled': True, 'image': 'registry.osism.tech/kolla/octavia-housekeeping:2024.2', 'volumes': ['/etc/kolla/octavia-housekeeping/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 
'', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port octavia-housekeeping 3306'], 'timeout': '30'}}})  2025-09-27 00:49:34.400648 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'octavia-worker', 'value': {'container_name': 'octavia_worker', 'group': 'octavia-worker', 'enabled': True, 'image': 'registry.osism.tech/kolla/octavia-worker:2024.2', 'volumes': ['/etc/kolla/octavia-worker/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '', ''], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_port octavia-worker 5672'], 'timeout': '30'}}})  2025-09-27 00:49:34.400655 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:49:34.400661 | orchestrator | 2025-09-27 00:49:34.400667 | orchestrator | TASK [haproxy-config : Configuring firewall for octavia] *********************** 2025-09-27 00:49:34.400674 | orchestrator | Saturday 27 September 2025 00:48:45 +0000 (0:00:00.730) 0:04:38.563 **** 2025-09-27 00:49:34.400680 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'octavia_api', 'value': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9876', 'listen_port': '9876', 'tls_backend': 'no'}})  2025-09-27 00:49:34.400686 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'octavia_api_external', 'value': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9876', 'listen_port': '9876', 'tls_backend': 'no'}})  2025-09-27 00:49:34.400692 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:49:34.400699 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'octavia_api', 'value': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9876', 'listen_port': '9876', 'tls_backend': 'no'}})  2025-09-27 00:49:34.400705 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'octavia_api_external', 'value': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9876', 'listen_port': '9876', 'tls_backend': 'no'}})  2025-09-27 00:49:34.400711 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:49:34.400717 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'octavia_api', 'value': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9876', 'listen_port': '9876', 'tls_backend': 'no'}})  2025-09-27 00:49:34.400723 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'octavia_api_external', 'value': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9876', 'listen_port': '9876', 'tls_backend': 'no'}})  2025-09-27 00:49:34.400730 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:49:34.400736 | orchestrator | 2025-09-27 00:49:34.400742 | orchestrator | TASK [proxysql-config : Copying over octavia ProxySQL users config] ************ 2025-09-27 00:49:34.400748 | orchestrator | Saturday 27 September 2025 00:48:46 +0000 (0:00:01.431) 0:04:39.995 **** 2025-09-27 00:49:34.400754 | orchestrator | changed: [testbed-node-0] 2025-09-27 00:49:34.400760 | orchestrator | changed: [testbed-node-1] 2025-09-27 00:49:34.400766 | orchestrator | changed: [testbed-node-2] 2025-09-27 00:49:34.400776 | orchestrator | 2025-09-27 00:49:34.400782 | orchestrator | TASK [proxysql-config : Copying over octavia ProxySQL rules config] 
************ 2025-09-27 00:49:34.400789 | orchestrator | Saturday 27 September 2025 00:48:48 +0000 (0:00:01.372) 0:04:41.367 **** 2025-09-27 00:49:34.400795 | orchestrator | changed: [testbed-node-0] 2025-09-27 00:49:34.400801 | orchestrator | changed: [testbed-node-1] 2025-09-27 00:49:34.400807 | orchestrator | changed: [testbed-node-2] 2025-09-27 00:49:34.400813 | orchestrator | 2025-09-27 00:49:34.400819 | orchestrator | TASK [include_role : opensearch] *********************************************** 2025-09-27 00:49:34.400825 | orchestrator | Saturday 27 September 2025 00:48:49 +0000 (0:00:01.977) 0:04:43.345 **** 2025-09-27 00:49:34.400831 | orchestrator | included: opensearch for testbed-node-0, testbed-node-1, testbed-node-2 2025-09-27 00:49:34.400837 | orchestrator | 2025-09-27 00:49:34.400844 | orchestrator | TASK [haproxy-config : Copying over opensearch haproxy config] ***************** 2025-09-27 00:49:34.400850 | orchestrator | Saturday 27 September 2025 00:48:51 +0000 (0:00:01.381) 0:04:44.726 **** 2025-09-27 00:49:34.400868 | orchestrator | changed: [testbed-node-0] => (item={'key': 'opensearch', 'value': {'container_name': 'opensearch', 'group': 'opensearch', 'enabled': True, 'image': 'registry.osism.tech/kolla/opensearch:2024.2', 'environment': {'OPENSEARCH_JAVA_OPTS': '-Xms1g -Xmx1g -Dlog4j2.formatMsgNoLookups=true'}, 'volumes': ['/etc/kolla/opensearch/:/var/lib/kolla/config_files/', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'opensearch:/var/lib/opensearch/data', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:9200'], 'timeout': '30'}, 'haproxy': {'opensearch': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9200', 'frontend_http_extra': ['option dontlog-normal']}}}}) 2025-09-27 00:49:34.400878 | orchestrator | changed: [testbed-node-1] => (item={'key': 'opensearch', 'value': {'container_name': 'opensearch', 'group': 'opensearch', 'enabled': True, 'image': 'registry.osism.tech/kolla/opensearch:2024.2', 'environment': {'OPENSEARCH_JAVA_OPTS': '-Xms1g -Xmx1g -Dlog4j2.formatMsgNoLookups=true'}, 'volumes': ['/etc/kolla/opensearch/:/var/lib/kolla/config_files/', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'opensearch:/var/lib/opensearch/data', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:9200'], 'timeout': '30'}, 'haproxy': {'opensearch': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9200', 'frontend_http_extra': ['option dontlog-normal']}}}}) 2025-09-27 00:49:34.400885 | orchestrator | changed: [testbed-node-2] => (item={'key': 'opensearch', 'value': {'container_name': 'opensearch', 'group': 'opensearch', 'enabled': True, 'image': 'registry.osism.tech/kolla/opensearch:2024.2', 'environment': {'OPENSEARCH_JAVA_OPTS': '-Xms1g -Xmx1g -Dlog4j2.formatMsgNoLookups=true'}, 'volumes': ['/etc/kolla/opensearch/:/var/lib/kolla/config_files/', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'opensearch:/var/lib/opensearch/data', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:9200'], 'timeout': '30'}, 'haproxy': {'opensearch': {'enabled': True, 'mode': 'http', 'external': False, 'port': 
'9200', 'frontend_http_extra': ['option dontlog-normal']}}}}) 2025-09-27 00:49:34.400892 | orchestrator | changed: [testbed-node-0] => (item={'key': 'opensearch-dashboards', 'value': {'container_name': 'opensearch_dashboards', 'group': 'opensearch-dashboards', 'enabled': True, 'environment': {'OPENSEARCH_DASHBOARDS_SECURITY_PLUGIN': 'False'}, 'image': 'registry.osism.tech/kolla/opensearch-dashboards:2024.2', 'volumes': ['/etc/kolla/opensearch-dashboards/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:5601'], 'timeout': '30'}, 'haproxy': {'opensearch-dashboards': {'enabled': True, 'mode': 'http', 'external': False, 'port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password'}, 'opensearch_dashboards_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '5601', 'listen_port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password'}}}}) 2025-09-27 00:49:34.400915 | orchestrator | changed: [testbed-node-2] => (item={'key': 'opensearch-dashboards', 'value': {'container_name': 'opensearch_dashboards', 'group': 'opensearch-dashboards', 'enabled': True, 'environment': {'OPENSEARCH_DASHBOARDS_SECURITY_PLUGIN': 'False'}, 'image': 'registry.osism.tech/kolla/opensearch-dashboards:2024.2', 'volumes': ['/etc/kolla/opensearch-dashboards/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:5601'], 'timeout': '30'}, 'haproxy': {'opensearch-dashboards': {'enabled': True, 'mode': 'http', 'external': False, 'port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password'}, 'opensearch_dashboards_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '5601', 'listen_port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password'}}}}) 2025-09-27 00:49:34.400926 | orchestrator | changed: [testbed-node-1] => (item={'key': 'opensearch-dashboards', 'value': {'container_name': 'opensearch_dashboards', 'group': 'opensearch-dashboards', 'enabled': True, 'environment': {'OPENSEARCH_DASHBOARDS_SECURITY_PLUGIN': 'False'}, 'image': 'registry.osism.tech/kolla/opensearch-dashboards:2024.2', 'volumes': ['/etc/kolla/opensearch-dashboards/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:5601'], 'timeout': '30'}, 'haproxy': {'opensearch-dashboards': {'enabled': True, 'mode': 'http', 'external': False, 'port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password'}, 'opensearch_dashboards_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '5601', 'listen_port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password'}}}}) 2025-09-27 00:49:34.400933 | orchestrator | 2025-09-27 00:49:34.400939 | orchestrator | TASK [haproxy-config : Add configuration for opensearch when using single external frontend] *** 2025-09-27 00:49:34.400945 | 
orchestrator | Saturday 27 September 2025 00:48:56 +0000 (0:00:05.286) 0:04:50.012 **** 2025-09-27 00:49:34.400952 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'opensearch', 'value': {'container_name': 'opensearch', 'group': 'opensearch', 'enabled': True, 'image': 'registry.osism.tech/kolla/opensearch:2024.2', 'environment': {'OPENSEARCH_JAVA_OPTS': '-Xms1g -Xmx1g -Dlog4j2.formatMsgNoLookups=true'}, 'volumes': ['/etc/kolla/opensearch/:/var/lib/kolla/config_files/', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'opensearch:/var/lib/opensearch/data', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:9200'], 'timeout': '30'}, 'haproxy': {'opensearch': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9200', 'frontend_http_extra': ['option dontlog-normal']}}}})  2025-09-27 00:49:34.400959 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'opensearch-dashboards', 'value': {'container_name': 'opensearch_dashboards', 'group': 'opensearch-dashboards', 'enabled': True, 'environment': {'OPENSEARCH_DASHBOARDS_SECURITY_PLUGIN': 'False'}, 'image': 'registry.osism.tech/kolla/opensearch-dashboards:2024.2', 'volumes': ['/etc/kolla/opensearch-dashboards/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:5601'], 'timeout': '30'}, 'haproxy': {'opensearch-dashboards': {'enabled': True, 'mode': 'http', 'external': False, 'port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password'}, 'opensearch_dashboards_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '5601', 'listen_port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password'}}}})  2025-09-27 00:49:34.400969 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:49:34.400976 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'opensearch', 'value': {'container_name': 'opensearch', 'group': 'opensearch', 'enabled': True, 'image': 'registry.osism.tech/kolla/opensearch:2024.2', 'environment': {'OPENSEARCH_JAVA_OPTS': '-Xms1g -Xmx1g -Dlog4j2.formatMsgNoLookups=true'}, 'volumes': ['/etc/kolla/opensearch/:/var/lib/kolla/config_files/', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'opensearch:/var/lib/opensearch/data', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:9200'], 'timeout': '30'}, 'haproxy': {'opensearch': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9200', 'frontend_http_extra': ['option dontlog-normal']}}}})  2025-09-27 00:49:34.400997 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'opensearch-dashboards', 'value': {'container_name': 'opensearch_dashboards', 'group': 'opensearch-dashboards', 'enabled': True, 'environment': {'OPENSEARCH_DASHBOARDS_SECURITY_PLUGIN': 'False'}, 'image': 'registry.osism.tech/kolla/opensearch-dashboards:2024.2', 'volumes': ['/etc/kolla/opensearch-dashboards/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': 
{'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:5601'], 'timeout': '30'}, 'haproxy': {'opensearch-dashboards': {'enabled': True, 'mode': 'http', 'external': False, 'port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password'}, 'opensearch_dashboards_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '5601', 'listen_port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password'}}}})  2025-09-27 00:49:34.401005 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:49:34.401011 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'opensearch', 'value': {'container_name': 'opensearch', 'group': 'opensearch', 'enabled': True, 'image': 'registry.osism.tech/kolla/opensearch:2024.2', 'environment': {'OPENSEARCH_JAVA_OPTS': '-Xms1g -Xmx1g -Dlog4j2.formatMsgNoLookups=true'}, 'volumes': ['/etc/kolla/opensearch/:/var/lib/kolla/config_files/', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'opensearch:/var/lib/opensearch/data', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:9200'], 'timeout': '30'}, 'haproxy': {'opensearch': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9200', 'frontend_http_extra': ['option dontlog-normal']}}}})  2025-09-27 00:49:34.401018 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'opensearch-dashboards', 'value': {'container_name': 'opensearch_dashboards', 'group': 'opensearch-dashboards', 'enabled': True, 'environment': {'OPENSEARCH_DASHBOARDS_SECURITY_PLUGIN': 'False'}, 'image': 'registry.osism.tech/kolla/opensearch-dashboards:2024.2', 'volumes': ['/etc/kolla/opensearch-dashboards/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:5601'], 'timeout': '30'}, 'haproxy': {'opensearch-dashboards': {'enabled': True, 'mode': 'http', 'external': False, 'port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password'}, 'opensearch_dashboards_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '5601', 'listen_port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password'}}}})  2025-09-27 00:49:34.401028 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:49:34.401035 | orchestrator | 2025-09-27 00:49:34.401041 | orchestrator | TASK [haproxy-config : Configuring firewall for opensearch] ******************** 2025-09-27 00:49:34.401047 | orchestrator | Saturday 27 September 2025 00:48:57 +0000 (0:00:00.630) 0:04:50.643 **** 2025-09-27 00:49:34.401053 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'opensearch', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9200', 'frontend_http_extra': ['option dontlog-normal']}})  2025-09-27 00:49:34.401059 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'opensearch-dashboards', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password'}})  2025-09-27 00:49:34.401066 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'opensearch_dashboards_external', 'value': {'enabled': True, 'mode': 'http', 
'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '5601', 'listen_port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password'}})  2025-09-27 00:49:34.401072 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:49:34.401078 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'opensearch', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9200', 'frontend_http_extra': ['option dontlog-normal']}})  2025-09-27 00:49:34.401095 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'opensearch-dashboards', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password'}})  2025-09-27 00:49:34.401102 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'opensearch_dashboards_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '5601', 'listen_port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password'}})  2025-09-27 00:49:34.401108 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:49:34.401118 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'opensearch', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9200', 'frontend_http_extra': ['option dontlog-normal']}})  2025-09-27 00:49:34.401124 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'opensearch-dashboards', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password'}})  2025-09-27 00:49:34.401131 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'opensearch_dashboards_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '5601', 'listen_port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password'}})  2025-09-27 00:49:34.401137 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:49:34.401143 | orchestrator | 2025-09-27 00:49:34.401149 | orchestrator | TASK [proxysql-config : Copying over opensearch ProxySQL users config] ********* 2025-09-27 00:49:34.401155 | orchestrator | Saturday 27 September 2025 00:48:58 +0000 (0:00:00.909) 0:04:51.552 **** 2025-09-27 00:49:34.401162 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:49:34.401172 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:49:34.401178 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:49:34.401184 | orchestrator | 2025-09-27 00:49:34.401190 | orchestrator | TASK [proxysql-config : Copying over opensearch ProxySQL rules config] ********* 2025-09-27 00:49:34.401196 | orchestrator | Saturday 27 September 2025 00:48:58 +0000 (0:00:00.794) 0:04:52.347 **** 2025-09-27 00:49:34.401214 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:49:34.401220 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:49:34.401226 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:49:34.401232 | orchestrator | 2025-09-27 00:49:34.401238 | orchestrator | TASK [include_role : prometheus] *********************************************** 2025-09-27 00:49:34.401244 | orchestrator | Saturday 27 September 2025 00:49:00 +0000 (0:00:01.304) 0:04:53.652 **** 2025-09-27 00:49:34.401250 | orchestrator | included: prometheus for testbed-node-0, testbed-node-1, testbed-node-2 2025-09-27 00:49:34.401256 | orchestrator | 2025-09-27 00:49:34.401263 | orchestrator | TASK [haproxy-config : Copying over prometheus haproxy config] ***************** 2025-09-27 
00:49:34.401269 | orchestrator | Saturday 27 September 2025 00:49:01 +0000 (0:00:01.376) 0:04:55.028 **** 2025-09-27 00:49:34.401275 | orchestrator | changed: [testbed-node-0] => (item={'key': 'prometheus-server', 'value': {'container_name': 'prometheus_server', 'group': 'prometheus', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-v2-server:2024.2', 'volumes': ['/etc/kolla/prometheus-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'prometheus_v2:/var/lib/prometheus', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'prometheus_server': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9091', 'active_passive': True}, 'prometheus_server_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9091', 'listen_port': '9091', 'active_passive': True}}}}) 2025-09-27 00:49:34.401282 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-node-exporter:2024.2', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}})  2025-09-27 00:49:34.401289 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'prometheus-mysqld-exporter', 'value': {'container_name': 'prometheus_mysqld_exporter', 'group': 'prometheus-mysqld-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-mysqld-exporter:2024.2', 'volumes': ['/etc/kolla/prometheus-mysqld-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2025-09-27 00:49:34.401306 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'prometheus-memcached-exporter', 'value': {'container_name': 'prometheus_memcached_exporter', 'group': 'prometheus-memcached-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-memcached-exporter:2024.2', 'volumes': ['/etc/kolla/prometheus-memcached-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2025-09-27 00:49:34.401317 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-cadvisor:2024.2', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}})  2025-09-27 00:49:34.401328 | orchestrator | changed: [testbed-node-1] => (item={'key': 'prometheus-server', 'value': {'container_name': 'prometheus_server', 'group': 'prometheus', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-v2-server:2024.2', 'volumes': ['/etc/kolla/prometheus-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'prometheus_v2:/var/lib/prometheus', 
'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'prometheus_server': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9091', 'active_passive': True}, 'prometheus_server_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9091', 'listen_port': '9091', 'active_passive': True}}}}) 2025-09-27 00:49:34.401334 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-node-exporter:2024.2', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}})  2025-09-27 00:49:34.401341 | orchestrator | changed: [testbed-node-2] => (item={'key': 'prometheus-server', 'value': {'container_name': 'prometheus_server', 'group': 'prometheus', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-v2-server:2024.2', 'volumes': ['/etc/kolla/prometheus-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'prometheus_v2:/var/lib/prometheus', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'prometheus_server': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9091', 'active_passive': True}, 'prometheus_server_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9091', 'listen_port': '9091', 'active_passive': True}}}}) 2025-09-27 00:49:34.401347 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'prometheus-mysqld-exporter', 'value': {'container_name': 'prometheus_mysqld_exporter', 'group': 'prometheus-mysqld-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-mysqld-exporter:2024.2', 'volumes': ['/etc/kolla/prometheus-mysqld-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2025-09-27 00:49:34.401354 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-node-exporter:2024.2', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}})  2025-09-27 00:49:34.401364 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'prometheus-memcached-exporter', 'value': {'container_name': 'prometheus_memcached_exporter', 'group': 'prometheus-memcached-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-memcached-exporter:2024.2', 'volumes': ['/etc/kolla/prometheus-memcached-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2025-09-27 00:49:34.401374 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'prometheus-mysqld-exporter', 'value': {'container_name': 'prometheus_mysqld_exporter', 'group': 'prometheus-mysqld-exporter', 'enabled': True, 'image': 
'registry.osism.tech/kolla/prometheus-mysqld-exporter:2024.2', 'volumes': ['/etc/kolla/prometheus-mysqld-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2025-09-27 00:49:34.401386 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-cadvisor:2024.2', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}})  2025-09-27 00:49:34.401393 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'prometheus-memcached-exporter', 'value': {'container_name': 'prometheus_memcached_exporter', 'group': 'prometheus-memcached-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-memcached-exporter:2024.2', 'volumes': ['/etc/kolla/prometheus-memcached-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2025-09-27 00:49:34.401399 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-cadvisor:2024.2', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}})  2025-09-27 00:49:34.401406 | orchestrator | changed: [testbed-node-0] => (item={'key': 'prometheus-alertmanager', 'value': {'container_name': 'prometheus_alertmanager', 'group': 'prometheus-alertmanager', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-alertmanager:2024.2', 'volumes': ['/etc/kolla/prometheus-alertmanager/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'prometheus:/var/lib/prometheus'], 'dimensions': {}, 'haproxy': {'prometheus_alertmanager': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9093', 'auth_user': 'admin', 'auth_pass': 'BXo64rLqmF7bTbWLDOnNJlD0qJ4BSTWocNHVNKU2', 'active_passive': True}, 'prometheus_alertmanager_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9093', 'listen_port': '9093', 'auth_user': 'admin', 'auth_pass': 'BXo64rLqmF7bTbWLDOnNJlD0qJ4BSTWocNHVNKU2', 'active_passive': True}}}}) 2025-09-27 00:49:34.401418 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'prometheus-openstack-exporter', 'value': {'container_name': 'prometheus_openstack_exporter', 'group': 'prometheus-openstack-exporter', 'enabled': False, 'environment': {'OS_COMPUTE_API_VERSION': 'latest'}, 'image': 'registry.osism.tech/kolla/prometheus-openstack-exporter:2024.2', 'volumes': ['/etc/kolla/prometheus-openstack-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 
'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'prometheus_openstack_exporter': {'enabled': False, 'mode': 'http', 'external': False, 'port': '9198', 'backend_http_extra': ['timeout server 45s']}, 'prometheus_openstack_exporter_external': {'enabled': False, 'mode': 'http', 'external': True, 'port': '9198', 'backend_http_extra': ['timeout server 45s']}}}})  2025-09-27 00:49:34.401428 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'prometheus-elasticsearch-exporter', 'value': {'container_name': 'prometheus_elasticsearch_exporter', 'group': 'prometheus-elasticsearch-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-elasticsearch-exporter:2024.2', 'volumes': ['/etc/kolla/prometheus-elasticsearch-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2025-09-27 00:49:34.401439 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'prometheus-blackbox-exporter', 'value': {'container_name': 'prometheus_blackbox_exporter', 'group': 'prometheus-blackbox-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-blackbox-exporter:2024.2', 'volumes': ['/etc/kolla/prometheus-blackbox-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2025-09-27 00:49:34.401446 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'prometheus-libvirt-exporter', 'value': {'container_name': 'prometheus_libvirt_exporter', 'group': 'prometheus-libvirt-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-libvirt-exporter:2024.2', 'volumes': ['/etc/kolla/prometheus-libvirt-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/run/libvirt:/run/libvirt:ro'], 'dimensions': {}}})  2025-09-27 00:49:34.401453 | orchestrator | changed: [testbed-node-1] => (item={'key': 'prometheus-alertmanager', 'value': {'container_name': 'prometheus_alertmanager', 'group': 'prometheus-alertmanager', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-alertmanager:2024.2', 'volumes': ['/etc/kolla/prometheus-alertmanager/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'prometheus:/var/lib/prometheus'], 'dimensions': {}, 'haproxy': {'prometheus_alertmanager': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9093', 'auth_user': 'admin', 'auth_pass': 'BXo64rLqmF7bTbWLDOnNJlD0qJ4BSTWocNHVNKU2', 'active_passive': True}, 'prometheus_alertmanager_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9093', 'listen_port': '9093', 'auth_user': 'admin', 'auth_pass': 'BXo64rLqmF7bTbWLDOnNJlD0qJ4BSTWocNHVNKU2', 'active_passive': True}}}}) 2025-09-27 00:49:34.401460 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'prometheus-openstack-exporter', 'value': {'container_name': 'prometheus_openstack_exporter', 'group': 'prometheus-openstack-exporter', 'enabled': False, 'environment': {'OS_COMPUTE_API_VERSION': 'latest'}, 'image': 'registry.osism.tech/kolla/prometheus-openstack-exporter:2024.2', 'volumes': ['/etc/kolla/prometheus-openstack-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 
'haproxy': {'prometheus_openstack_exporter': {'enabled': False, 'mode': 'http', 'external': False, 'port': '9198', 'backend_http_extra': ['timeout server 45s']}, 'prometheus_openstack_exporter_external': {'enabled': False, 'mode': 'http', 'external': True, 'port': '9198', 'backend_http_extra': ['timeout server 45s']}}}})  2025-09-27 00:49:34.401470 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'prometheus-elasticsearch-exporter', 'value': {'container_name': 'prometheus_elasticsearch_exporter', 'group': 'prometheus-elasticsearch-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-elasticsearch-exporter:2024.2', 'volumes': ['/etc/kolla/prometheus-elasticsearch-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2025-09-27 00:49:34.401476 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'prometheus-blackbox-exporter', 'value': {'container_name': 'prometheus_blackbox_exporter', 'group': 'prometheus-blackbox-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-blackbox-exporter:2024.2', 'volumes': ['/etc/kolla/prometheus-blackbox-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2025-09-27 00:49:34.401490 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'prometheus-libvirt-exporter', 'value': {'container_name': 'prometheus_libvirt_exporter', 'group': 'prometheus-libvirt-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-libvirt-exporter:2024.2', 'volumes': ['/etc/kolla/prometheus-libvirt-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/run/libvirt:/run/libvirt:ro'], 'dimensions': {}}})  2025-09-27 00:49:34.401497 | orchestrator | changed: [testbed-node-2] => (item={'key': 'prometheus-alertmanager', 'value': {'container_name': 'prometheus_alertmanager', 'group': 'prometheus-alertmanager', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-alertmanager:2024.2', 'volumes': ['/etc/kolla/prometheus-alertmanager/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'prometheus:/var/lib/prometheus'], 'dimensions': {}, 'haproxy': {'prometheus_alertmanager': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9093', 'auth_user': 'admin', 'auth_pass': 'BXo64rLqmF7bTbWLDOnNJlD0qJ4BSTWocNHVNKU2', 'active_passive': True}, 'prometheus_alertmanager_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9093', 'listen_port': '9093', 'auth_user': 'admin', 'auth_pass': 'BXo64rLqmF7bTbWLDOnNJlD0qJ4BSTWocNHVNKU2', 'active_passive': True}}}}) 2025-09-27 00:49:34.401504 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'prometheus-openstack-exporter', 'value': {'container_name': 'prometheus_openstack_exporter', 'group': 'prometheus-openstack-exporter', 'enabled': False, 'environment': {'OS_COMPUTE_API_VERSION': 'latest'}, 'image': 'registry.osism.tech/kolla/prometheus-openstack-exporter:2024.2', 'volumes': ['/etc/kolla/prometheus-openstack-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'prometheus_openstack_exporter': 
{'enabled': False, 'mode': 'http', 'external': False, 'port': '9198', 'backend_http_extra': ['timeout server 45s']}, 'prometheus_openstack_exporter_external': {'enabled': False, 'mode': 'http', 'external': True, 'port': '9198', 'backend_http_extra': ['timeout server 45s']}}}})  2025-09-27 00:49:34.401511 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'prometheus-elasticsearch-exporter', 'value': {'container_name': 'prometheus_elasticsearch_exporter', 'group': 'prometheus-elasticsearch-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-elasticsearch-exporter:2024.2', 'volumes': ['/etc/kolla/prometheus-elasticsearch-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2025-09-27 00:49:34.401517 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'prometheus-blackbox-exporter', 'value': {'container_name': 'prometheus_blackbox_exporter', 'group': 'prometheus-blackbox-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-blackbox-exporter:2024.2', 'volumes': ['/etc/kolla/prometheus-blackbox-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2025-09-27 00:49:34.401527 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'prometheus-libvirt-exporter', 'value': {'container_name': 'prometheus_libvirt_exporter', 'group': 'prometheus-libvirt-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-libvirt-exporter:2024.2', 'volumes': ['/etc/kolla/prometheus-libvirt-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/run/libvirt:/run/libvirt:ro'], 'dimensions': {}}})  2025-09-27 00:49:34.401539 | orchestrator | 2025-09-27 00:49:34.401546 | orchestrator | TASK [haproxy-config : Add configuration for prometheus when using single external frontend] *** 2025-09-27 00:49:34.401552 | orchestrator | Saturday 27 September 2025 00:49:06 +0000 (0:00:04.395) 0:04:59.424 **** 2025-09-27 00:49:34.401561 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'prometheus-server', 'value': {'container_name': 'prometheus_server', 'group': 'prometheus', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-v2-server:2024.2', 'volumes': ['/etc/kolla/prometheus-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'prometheus_v2:/var/lib/prometheus', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'prometheus_server': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9091', 'active_passive': True}, 'prometheus_server_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9091', 'listen_port': '9091', 'active_passive': True}}}})  2025-09-27 00:49:34.401568 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-node-exporter:2024.2', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}})  2025-09-27 
00:49:34.401574 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'prometheus-mysqld-exporter', 'value': {'container_name': 'prometheus_mysqld_exporter', 'group': 'prometheus-mysqld-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-mysqld-exporter:2024.2', 'volumes': ['/etc/kolla/prometheus-mysqld-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2025-09-27 00:49:34.401581 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'prometheus-memcached-exporter', 'value': {'container_name': 'prometheus_memcached_exporter', 'group': 'prometheus-memcached-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-memcached-exporter:2024.2', 'volumes': ['/etc/kolla/prometheus-memcached-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2025-09-27 00:49:34.401587 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-cadvisor:2024.2', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}})  2025-09-27 00:49:34.401598 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'prometheus-alertmanager', 'value': {'container_name': 'prometheus_alertmanager', 'group': 'prometheus-alertmanager', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-alertmanager:2024.2', 'volumes': ['/etc/kolla/prometheus-alertmanager/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'prometheus:/var/lib/prometheus'], 'dimensions': {}, 'haproxy': {'prometheus_alertmanager': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9093', 'auth_user': 'admin', 'auth_pass': 'BXo64rLqmF7bTbWLDOnNJlD0qJ4BSTWocNHVNKU2', 'active_passive': True}, 'prometheus_alertmanager_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9093', 'listen_port': '9093', 'auth_user': 'admin', 'auth_pass': 'BXo64rLqmF7bTbWLDOnNJlD0qJ4BSTWocNHVNKU2', 'active_passive': True}}}})  2025-09-27 00:49:34.401612 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'prometheus-openstack-exporter', 'value': {'container_name': 'prometheus_openstack_exporter', 'group': 'prometheus-openstack-exporter', 'enabled': False, 'environment': {'OS_COMPUTE_API_VERSION': 'latest'}, 'image': 'registry.osism.tech/kolla/prometheus-openstack-exporter:2024.2', 'volumes': ['/etc/kolla/prometheus-openstack-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'prometheus_openstack_exporter': {'enabled': False, 'mode': 'http', 'external': False, 'port': '9198', 'backend_http_extra': ['timeout server 45s']}, 'prometheus_openstack_exporter_external': {'enabled': False, 'mode': 'http', 'external': True, 'port': '9198', 'backend_http_extra': ['timeout server 45s']}}}})  
2025-09-27 00:49:34.401619 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'prometheus-elasticsearch-exporter', 'value': {'container_name': 'prometheus_elasticsearch_exporter', 'group': 'prometheus-elasticsearch-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-elasticsearch-exporter:2024.2', 'volumes': ['/etc/kolla/prometheus-elasticsearch-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2025-09-27 00:49:34.401625 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'prometheus-blackbox-exporter', 'value': {'container_name': 'prometheus_blackbox_exporter', 'group': 'prometheus-blackbox-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-blackbox-exporter:2024.2', 'volumes': ['/etc/kolla/prometheus-blackbox-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2025-09-27 00:49:34.401632 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'prometheus-libvirt-exporter', 'value': {'container_name': 'prometheus_libvirt_exporter', 'group': 'prometheus-libvirt-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-libvirt-exporter:2024.2', 'volumes': ['/etc/kolla/prometheus-libvirt-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/run/libvirt:/run/libvirt:ro'], 'dimensions': {}}})  2025-09-27 00:49:34.401638 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:49:34.401645 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'prometheus-server', 'value': {'container_name': 'prometheus_server', 'group': 'prometheus', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-v2-server:2024.2', 'volumes': ['/etc/kolla/prometheus-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'prometheus_v2:/var/lib/prometheus', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'prometheus_server': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9091', 'active_passive': True}, 'prometheus_server_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9091', 'listen_port': '9091', 'active_passive': True}}}})  2025-09-27 00:49:34.401651 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-node-exporter:2024.2', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}})  2025-09-27 00:49:34.401666 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'prometheus-mysqld-exporter', 'value': {'container_name': 'prometheus_mysqld_exporter', 'group': 'prometheus-mysqld-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-mysqld-exporter:2024.2', 'volumes': ['/etc/kolla/prometheus-mysqld-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2025-09-27 
00:49:34.401676 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'prometheus-memcached-exporter', 'value': {'container_name': 'prometheus_memcached_exporter', 'group': 'prometheus-memcached-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-memcached-exporter:2024.2', 'volumes': ['/etc/kolla/prometheus-memcached-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2025-09-27 00:49:34.401683 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-cadvisor:2024.2', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}})  2025-09-27 00:49:34.401689 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'prometheus-alertmanager', 'value': {'container_name': 'prometheus_alertmanager', 'group': 'prometheus-alertmanager', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-alertmanager:2024.2', 'volumes': ['/etc/kolla/prometheus-alertmanager/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'prometheus:/var/lib/prometheus'], 'dimensions': {}, 'haproxy': {'prometheus_alertmanager': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9093', 'auth_user': 'admin', 'auth_pass': 'BXo64rLqmF7bTbWLDOnNJlD0qJ4BSTWocNHVNKU2', 'active_passive': True}, 'prometheus_alertmanager_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9093', 'listen_port': '9093', 'auth_user': 'admin', 'auth_pass': 'BXo64rLqmF7bTbWLDOnNJlD0qJ4BSTWocNHVNKU2', 'active_passive': True}}}})  2025-09-27 00:49:34.401696 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'prometheus-openstack-exporter', 'value': {'container_name': 'prometheus_openstack_exporter', 'group': 'prometheus-openstack-exporter', 'enabled': False, 'environment': {'OS_COMPUTE_API_VERSION': 'latest'}, 'image': 'registry.osism.tech/kolla/prometheus-openstack-exporter:2024.2', 'volumes': ['/etc/kolla/prometheus-openstack-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'prometheus_openstack_exporter': {'enabled': False, 'mode': 'http', 'external': False, 'port': '9198', 'backend_http_extra': ['timeout server 45s']}, 'prometheus_openstack_exporter_external': {'enabled': False, 'mode': 'http', 'external': True, 'port': '9198', 'backend_http_extra': ['timeout server 45s']}}}})  2025-09-27 00:49:34.401703 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'prometheus-elasticsearch-exporter', 'value': {'container_name': 'prometheus_elasticsearch_exporter', 'group': 'prometheus-elasticsearch-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-elasticsearch-exporter:2024.2', 'volumes': ['/etc/kolla/prometheus-elasticsearch-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 
'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2025-09-27 00:49:34.401717 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'prometheus-blackbox-exporter', 'value': {'container_name': 'prometheus_blackbox_exporter', 'group': 'prometheus-blackbox-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-blackbox-exporter:2024.2', 'volumes': ['/etc/kolla/prometheus-blackbox-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2025-09-27 00:49:34.401724 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'prometheus-libvirt-exporter', 'value': {'container_name': 'prometheus_libvirt_exporter', 'group': 'prometheus-libvirt-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-libvirt-exporter:2024.2', 'volumes': ['/etc/kolla/prometheus-libvirt-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/run/libvirt:/run/libvirt:ro'], 'dimensions': {}}})  2025-09-27 00:49:34.401730 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:49:34.401740 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'prometheus-server', 'value': {'container_name': 'prometheus_server', 'group': 'prometheus', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-v2-server:2024.2', 'volumes': ['/etc/kolla/prometheus-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'prometheus_v2:/var/lib/prometheus', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'prometheus_server': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9091', 'active_passive': True}, 'prometheus_server_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9091', 'listen_port': '9091', 'active_passive': True}}}})  2025-09-27 00:49:34.401746 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-node-exporter:2024.2', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}})  2025-09-27 00:49:34.401753 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'prometheus-mysqld-exporter', 'value': {'container_name': 'prometheus_mysqld_exporter', 'group': 'prometheus-mysqld-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-mysqld-exporter:2024.2', 'volumes': ['/etc/kolla/prometheus-mysqld-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2025-09-27 00:49:34.401759 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'prometheus-memcached-exporter', 'value': {'container_name': 'prometheus_memcached_exporter', 'group': 'prometheus-memcached-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-memcached-exporter:2024.2', 'volumes': ['/etc/kolla/prometheus-memcached-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 
'dimensions': {}}})  2025-09-27 00:49:34.401765 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-cadvisor:2024.2', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}})  2025-09-27 00:49:34.401779 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'prometheus-alertmanager', 'value': {'container_name': 'prometheus_alertmanager', 'group': 'prometheus-alertmanager', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-alertmanager:2024.2', 'volumes': ['/etc/kolla/prometheus-alertmanager/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'prometheus:/var/lib/prometheus'], 'dimensions': {}, 'haproxy': {'prometheus_alertmanager': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9093', 'auth_user': 'admin', 'auth_pass': 'BXo64rLqmF7bTbWLDOnNJlD0qJ4BSTWocNHVNKU2', 'active_passive': True}, 'prometheus_alertmanager_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9093', 'listen_port': '9093', 'auth_user': 'admin', 'auth_pass': 'BXo64rLqmF7bTbWLDOnNJlD0qJ4BSTWocNHVNKU2', 'active_passive': True}}}})  2025-09-27 00:49:34.401790 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'prometheus-openstack-exporter', 'value': {'container_name': 'prometheus_openstack_exporter', 'group': 'prometheus-openstack-exporter', 'enabled': False, 'environment': {'OS_COMPUTE_API_VERSION': 'latest'}, 'image': 'registry.osism.tech/kolla/prometheus-openstack-exporter:2024.2', 'volumes': ['/etc/kolla/prometheus-openstack-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'prometheus_openstack_exporter': {'enabled': False, 'mode': 'http', 'external': False, 'port': '9198', 'backend_http_extra': ['timeout server 45s']}, 'prometheus_openstack_exporter_external': {'enabled': False, 'mode': 'http', 'external': True, 'port': '9198', 'backend_http_extra': ['timeout server 45s']}}}})  2025-09-27 00:49:34.401796 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'prometheus-elasticsearch-exporter', 'value': {'container_name': 'prometheus_elasticsearch_exporter', 'group': 'prometheus-elasticsearch-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-elasticsearch-exporter:2024.2', 'volumes': ['/etc/kolla/prometheus-elasticsearch-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2025-09-27 00:49:34.401803 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'prometheus-blackbox-exporter', 'value': {'container_name': 'prometheus_blackbox_exporter', 'group': 'prometheus-blackbox-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-blackbox-exporter:2024.2', 'volumes': ['/etc/kolla/prometheus-blackbox-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2025-09-27 00:49:34.401809 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'prometheus-libvirt-exporter', 'value': {'container_name': 'prometheus_libvirt_exporter', 'group': 'prometheus-libvirt-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-libvirt-exporter:2024.2', 'volumes': ['/etc/kolla/prometheus-libvirt-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/run/libvirt:/run/libvirt:ro'], 'dimensions': {}}})  2025-09-27 00:49:34.401816 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:49:34.401822 | orchestrator | 2025-09-27 00:49:34.401828 | orchestrator | TASK [haproxy-config : Configuring firewall for prometheus] ******************** 2025-09-27 00:49:34.401834 | orchestrator | Saturday 27 September 2025 00:49:07 +0000 (0:00:01.195) 0:05:00.620 **** 2025-09-27 00:49:34.401841 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'prometheus_server', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9091', 'active_passive': True}})  2025-09-27 00:49:34.401847 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'prometheus_server_external', 'value': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9091', 'listen_port': '9091', 'active_passive': True}})  2025-09-27 00:49:34.401859 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'prometheus_alertmanager', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9093', 'auth_user': 'admin', 'auth_pass': 'BXo64rLqmF7bTbWLDOnNJlD0qJ4BSTWocNHVNKU2', 'active_passive': True}})  2025-09-27 00:49:34.401866 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'prometheus_alertmanager_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9093', 'listen_port': '9093', 'auth_user': 'admin', 'auth_pass': 'BXo64rLqmF7bTbWLDOnNJlD0qJ4BSTWocNHVNKU2', 'active_passive': True}})  2025-09-27 00:49:34.401873 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:49:34.401879 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'prometheus_server', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9091', 'active_passive': True}})  2025-09-27 00:49:34.401888 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'prometheus_server_external', 'value': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9091', 'listen_port': '9091', 'active_passive': True}})  2025-09-27 00:49:34.401895 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'prometheus_alertmanager', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9093', 'auth_user': 'admin', 'auth_pass': 'BXo64rLqmF7bTbWLDOnNJlD0qJ4BSTWocNHVNKU2', 'active_passive': True}})  2025-09-27 00:49:34.401905 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'prometheus_alertmanager_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9093', 'listen_port': '9093', 'auth_user': 'admin', 'auth_pass': 'BXo64rLqmF7bTbWLDOnNJlD0qJ4BSTWocNHVNKU2', 'active_passive': True}})  2025-09-27 00:49:34.401911 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:49:34.401917 | orchestrator | skipping: [testbed-node-2] => (item={'key': 
'prometheus_server', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9091', 'active_passive': True}})  2025-09-27 00:49:34.401924 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'prometheus_server_external', 'value': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9091', 'listen_port': '9091', 'active_passive': True}})  2025-09-27 00:49:34.401930 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'prometheus_alertmanager', 'value': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9093', 'auth_user': 'admin', 'auth_pass': 'BXo64rLqmF7bTbWLDOnNJlD0qJ4BSTWocNHVNKU2', 'active_passive': True}})  2025-09-27 00:49:34.401937 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'prometheus_alertmanager_external', 'value': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9093', 'listen_port': '9093', 'auth_user': 'admin', 'auth_pass': 'BXo64rLqmF7bTbWLDOnNJlD0qJ4BSTWocNHVNKU2', 'active_passive': True}})  2025-09-27 00:49:34.401943 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:49:34.401949 | orchestrator | 2025-09-27 00:49:34.401955 | orchestrator | TASK [proxysql-config : Copying over prometheus ProxySQL users config] ********* 2025-09-27 00:49:34.401962 | orchestrator | Saturday 27 September 2025 00:49:08 +0000 (0:00:00.961) 0:05:01.581 **** 2025-09-27 00:49:34.401968 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:49:34.401974 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:49:34.401980 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:49:34.401986 | orchestrator | 2025-09-27 00:49:34.401992 | orchestrator | TASK [proxysql-config : Copying over prometheus ProxySQL rules config] ********* 2025-09-27 00:49:34.401998 | orchestrator | Saturday 27 September 2025 00:49:08 +0000 (0:00:00.451) 0:05:02.033 **** 2025-09-27 00:49:34.402008 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:49:34.402035 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:49:34.402042 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:49:34.402049 | orchestrator | 2025-09-27 00:49:34.402055 | orchestrator | TASK [include_role : rabbitmq] ************************************************* 2025-09-27 00:49:34.402061 | orchestrator | Saturday 27 September 2025 00:49:10 +0000 (0:00:01.377) 0:05:03.410 **** 2025-09-27 00:49:34.402067 | orchestrator | included: rabbitmq for testbed-node-0, testbed-node-1, testbed-node-2 2025-09-27 00:49:34.402073 | orchestrator | 2025-09-27 00:49:34.402079 | orchestrator | TASK [haproxy-config : Copying over rabbitmq haproxy config] ******************* 2025-09-27 00:49:34.402086 | orchestrator | Saturday 27 September 2025 00:49:11 +0000 (0:00:01.722) 0:05:05.132 **** 2025-09-27 00:49:34.402092 | orchestrator | changed: [testbed-node-0] => (item={'key': 'rabbitmq', 'value': {'container_name': 'rabbitmq', 'group': None, 'enabled': True, 'image': 'registry.osism.tech/kolla/rabbitmq:2024.2', 'bootstrap_environment': {'KOLLA_BOOTSTRAP': None, 'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': None, 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': None, 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'volumes': ['/etc/kolla/rabbitmq/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'rabbitmq:/var/lib/rabbitmq/', 
'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_rabbitmq'], 'timeout': '30'}, 'haproxy': {'rabbitmq_management': {'enabled': 'yes', 'mode': 'http', 'port': '15672', 'host_group': 'rabbitmq'}}}}) 2025-09-27 00:49:34.402107 | orchestrator | changed: [testbed-node-1] => (item={'key': 'rabbitmq', 'value': {'container_name': 'rabbitmq', 'group': None, 'enabled': True, 'image': 'registry.osism.tech/kolla/rabbitmq:2024.2', 'bootstrap_environment': {'KOLLA_BOOTSTRAP': None, 'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': None, 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': None, 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'volumes': ['/etc/kolla/rabbitmq/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'rabbitmq:/var/lib/rabbitmq/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_rabbitmq'], 'timeout': '30'}, 'haproxy': {'rabbitmq_management': {'enabled': 'yes', 'mode': 'http', 'port': '15672', 'host_group': 'rabbitmq'}}}}) 2025-09-27 00:49:34.402114 | orchestrator | changed: [testbed-node-2] => (item={'key': 'rabbitmq', 'value': {'container_name': 'rabbitmq', 'group': None, 'enabled': True, 'image': 'registry.osism.tech/kolla/rabbitmq:2024.2', 'bootstrap_environment': {'KOLLA_BOOTSTRAP': None, 'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': None, 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': None, 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'volumes': ['/etc/kolla/rabbitmq/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'rabbitmq:/var/lib/rabbitmq/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_rabbitmq'], 'timeout': '30'}, 'haproxy': {'rabbitmq_management': {'enabled': 'yes', 'mode': 'http', 'port': '15672', 'host_group': 'rabbitmq'}}}}) 2025-09-27 00:49:34.402121 | orchestrator | 2025-09-27 00:49:34.402128 | orchestrator | TASK [haproxy-config : Add configuration for rabbitmq when using single external frontend] *** 2025-09-27 00:49:34.402134 | orchestrator | Saturday 27 September 2025 00:49:14 +0000 (0:00:02.364) 0:05:07.496 **** 2025-09-27 00:49:34.402140 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'rabbitmq', 'value': {'container_name': 'rabbitmq', 'group': None, 'enabled': True, 'image': 'registry.osism.tech/kolla/rabbitmq:2024.2', 'bootstrap_environment': {'KOLLA_BOOTSTRAP': None, 'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': None, 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': None, 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'volumes': ['/etc/kolla/rabbitmq/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'rabbitmq:/var/lib/rabbitmq/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_rabbitmq'], 'timeout': '30'}, 'haproxy': {'rabbitmq_management': {'enabled': 'yes', 
'mode': 'http', 'port': '15672', 'host_group': 'rabbitmq'}}}})  2025-09-27 00:49:34.402151 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'rabbitmq', 'value': {'container_name': 'rabbitmq', 'group': None, 'enabled': True, 'image': 'registry.osism.tech/kolla/rabbitmq:2024.2', 'bootstrap_environment': {'KOLLA_BOOTSTRAP': None, 'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': None, 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': None, 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'volumes': ['/etc/kolla/rabbitmq/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'rabbitmq:/var/lib/rabbitmq/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_rabbitmq'], 'timeout': '30'}, 'haproxy': {'rabbitmq_management': {'enabled': 'yes', 'mode': 'http', 'port': '15672', 'host_group': 'rabbitmq'}}}})  2025-09-27 00:49:34.402158 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:49:34.402164 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:49:34.402178 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'rabbitmq', 'value': {'container_name': 'rabbitmq', 'group': None, 'enabled': True, 'image': 'registry.osism.tech/kolla/rabbitmq:2024.2', 'bootstrap_environment': {'KOLLA_BOOTSTRAP': None, 'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': None, 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'RABBITMQ_CLUSTER_COOKIE': None, 'RABBITMQ_LOG_DIR': '/var/log/kolla/rabbitmq'}, 'volumes': ['/etc/kolla/rabbitmq/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'rabbitmq:/var/lib/rabbitmq/', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_rabbitmq'], 'timeout': '30'}, 'haproxy': {'rabbitmq_management': {'enabled': 'yes', 'mode': 'http', 'port': '15672', 'host_group': 'rabbitmq'}}}})  2025-09-27 00:49:34.402185 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:49:34.402191 | orchestrator | 2025-09-27 00:49:34.402209 | orchestrator | TASK [haproxy-config : Configuring firewall for rabbitmq] ********************** 2025-09-27 00:49:34.402215 | orchestrator | Saturday 27 September 2025 00:49:14 +0000 (0:00:00.423) 0:05:07.920 **** 2025-09-27 00:49:34.402222 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'rabbitmq_management', 'value': {'enabled': 'yes', 'mode': 'http', 'port': '15672', 'host_group': 'rabbitmq'}})  2025-09-27 00:49:34.402228 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:49:34.402234 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'rabbitmq_management', 'value': {'enabled': 'yes', 'mode': 'http', 'port': '15672', 'host_group': 'rabbitmq'}})  2025-09-27 00:49:34.402240 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:49:34.402246 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'rabbitmq_management', 'value': {'enabled': 'yes', 'mode': 'http', 'port': '15672', 'host_group': 'rabbitmq'}})  2025-09-27 00:49:34.402257 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:49:34.402263 | orchestrator | 2025-09-27 00:49:34.402269 | orchestrator | TASK [proxysql-config : Copying over rabbitmq ProxySQL users config] 
*********** 2025-09-27 00:49:34.402275 | orchestrator | Saturday 27 September 2025 00:49:15 +0000 (0:00:00.938) 0:05:08.859 **** 2025-09-27 00:49:34.402282 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:49:34.402288 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:49:34.402294 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:49:34.402300 | orchestrator | 2025-09-27 00:49:34.402306 | orchestrator | TASK [proxysql-config : Copying over rabbitmq ProxySQL rules config] *********** 2025-09-27 00:49:34.402312 | orchestrator | Saturday 27 September 2025 00:49:15 +0000 (0:00:00.423) 0:05:09.283 **** 2025-09-27 00:49:34.402318 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:49:34.402324 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:49:34.402330 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:49:34.402336 | orchestrator | 2025-09-27 00:49:34.402342 | orchestrator | TASK [include_role : skyline] ************************************************** 2025-09-27 00:49:34.402348 | orchestrator | Saturday 27 September 2025 00:49:17 +0000 (0:00:01.306) 0:05:10.589 **** 2025-09-27 00:49:34.402354 | orchestrator | included: skyline for testbed-node-0, testbed-node-1, testbed-node-2 2025-09-27 00:49:34.402360 | orchestrator | 2025-09-27 00:49:34.402367 | orchestrator | TASK [haproxy-config : Copying over skyline haproxy config] ******************** 2025-09-27 00:49:34.402373 | orchestrator | Saturday 27 September 2025 00:49:18 +0000 (0:00:01.737) 0:05:12.327 **** 2025-09-27 00:49:34.402379 | orchestrator | changed: [testbed-node-0] => (item={'key': 'skyline-apiserver', 'value': {'container_name': 'skyline_apiserver', 'group': 'skyline-apiserver', 'enabled': True, 'image': 'registry.osism.tech/kolla/skyline-apiserver:2024.2', 'volumes': ['/etc/kolla/skyline-apiserver/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:9998/docs'], 'timeout': '30'}, 'haproxy': {'skyline_apiserver': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9998', 'listen_port': '9998', 'tls_backend': 'no'}, 'skyline_apiserver_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9998', 'listen_port': '9998', 'tls_backend': 'no'}}}}) 2025-09-27 00:49:34.402390 | orchestrator | changed: [testbed-node-1] => (item={'key': 'skyline-apiserver', 'value': {'container_name': 'skyline_apiserver', 'group': 'skyline-apiserver', 'enabled': True, 'image': 'registry.osism.tech/kolla/skyline-apiserver:2024.2', 'volumes': ['/etc/kolla/skyline-apiserver/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:9998/docs'], 'timeout': '30'}, 'haproxy': {'skyline_apiserver': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9998', 'listen_port': '9998', 'tls_backend': 'no'}, 'skyline_apiserver_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9998', 'listen_port': '9998', 'tls_backend': 'no'}}}}) 2025-09-27 00:49:34.402400 | orchestrator | changed: [testbed-node-2] => (item={'key': 
'skyline-apiserver', 'value': {'container_name': 'skyline_apiserver', 'group': 'skyline-apiserver', 'enabled': True, 'image': 'registry.osism.tech/kolla/skyline-apiserver:2024.2', 'volumes': ['/etc/kolla/skyline-apiserver/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:9998/docs'], 'timeout': '30'}, 'haproxy': {'skyline_apiserver': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9998', 'listen_port': '9998', 'tls_backend': 'no'}, 'skyline_apiserver_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9998', 'listen_port': '9998', 'tls_backend': 'no'}}}}) 2025-09-27 00:49:34.402411 | orchestrator | changed: [testbed-node-0] => (item={'key': 'skyline-console', 'value': {'container_name': 'skyline_console', 'group': 'skyline-console', 'enabled': True, 'image': 'registry.osism.tech/kolla/skyline-console:2024.2', 'volumes': ['/etc/kolla/skyline-console/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:9999/docs'], 'timeout': '30'}, 'haproxy': {'skyline_console': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9999', 'listen_port': '9999', 'tls_backend': 'no'}, 'skyline_console_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9999', 'listen_port': '9999', 'tls_backend': 'no'}}}}) 2025-09-27 00:49:34.402419 | orchestrator | changed: [testbed-node-1] => (item={'key': 'skyline-console', 'value': {'container_name': 'skyline_console', 'group': 'skyline-console', 'enabled': True, 'image': 'registry.osism.tech/kolla/skyline-console:2024.2', 'volumes': ['/etc/kolla/skyline-console/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:9999/docs'], 'timeout': '30'}, 'haproxy': {'skyline_console': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9999', 'listen_port': '9999', 'tls_backend': 'no'}, 'skyline_console_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9999', 'listen_port': '9999', 'tls_backend': 'no'}}}}) 2025-09-27 00:49:34.402425 | orchestrator | changed: [testbed-node-2] => (item={'key': 'skyline-console', 'value': {'container_name': 'skyline_console', 'group': 'skyline-console', 'enabled': True, 'image': 'registry.osism.tech/kolla/skyline-console:2024.2', 'volumes': ['/etc/kolla/skyline-console/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:9999/docs'], 'timeout': '30'}, 'haproxy': {'skyline_console': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9999', 'listen_port': '9999', 'tls_backend': 'no'}, 
'skyline_console_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9999', 'listen_port': '9999', 'tls_backend': 'no'}}}}) 2025-09-27 00:49:34.402432 | orchestrator | 2025-09-27 00:49:34.402441 | orchestrator | TASK [haproxy-config : Add configuration for skyline when using single external frontend] *** 2025-09-27 00:49:34.402447 | orchestrator | Saturday 27 September 2025 00:49:25 +0000 (0:00:06.060) 0:05:18.388 **** 2025-09-27 00:49:34.402454 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'skyline-apiserver', 'value': {'container_name': 'skyline_apiserver', 'group': 'skyline-apiserver', 'enabled': True, 'image': 'registry.osism.tech/kolla/skyline-apiserver:2024.2', 'volumes': ['/etc/kolla/skyline-apiserver/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:9998/docs'], 'timeout': '30'}, 'haproxy': {'skyline_apiserver': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9998', 'listen_port': '9998', 'tls_backend': 'no'}, 'skyline_apiserver_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9998', 'listen_port': '9998', 'tls_backend': 'no'}}}})  2025-09-27 00:49:34.402464 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'skyline-console', 'value': {'container_name': 'skyline_console', 'group': 'skyline-console', 'enabled': True, 'image': 'registry.osism.tech/kolla/skyline-console:2024.2', 'volumes': ['/etc/kolla/skyline-console/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:9999/docs'], 'timeout': '30'}, 'haproxy': {'skyline_console': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9999', 'listen_port': '9999', 'tls_backend': 'no'}, 'skyline_console_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9999', 'listen_port': '9999', 'tls_backend': 'no'}}}})  2025-09-27 00:49:34.402471 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:49:34.402477 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'skyline-apiserver', 'value': {'container_name': 'skyline_apiserver', 'group': 'skyline-apiserver', 'enabled': True, 'image': 'registry.osism.tech/kolla/skyline-apiserver:2024.2', 'volumes': ['/etc/kolla/skyline-apiserver/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:9998/docs'], 'timeout': '30'}, 'haproxy': {'skyline_apiserver': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9998', 'listen_port': '9998', 'tls_backend': 'no'}, 'skyline_apiserver_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9998', 'listen_port': '9998', 'tls_backend': 'no'}}}})  2025-09-27 00:49:34.402524 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'skyline-console', 'value': 
{'container_name': 'skyline_console', 'group': 'skyline-console', 'enabled': True, 'image': 'registry.osism.tech/kolla/skyline-console:2024.2', 'volumes': ['/etc/kolla/skyline-console/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:9999/docs'], 'timeout': '30'}, 'haproxy': {'skyline_console': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9999', 'listen_port': '9999', 'tls_backend': 'no'}, 'skyline_console_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9999', 'listen_port': '9999', 'tls_backend': 'no'}}}})  2025-09-27 00:49:34.402538 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:49:34.402553 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'skyline-apiserver', 'value': {'container_name': 'skyline_apiserver', 'group': 'skyline-apiserver', 'enabled': True, 'image': 'registry.osism.tech/kolla/skyline-apiserver:2024.2', 'volumes': ['/etc/kolla/skyline-apiserver/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:9998/docs'], 'timeout': '30'}, 'haproxy': {'skyline_apiserver': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9998', 'listen_port': '9998', 'tls_backend': 'no'}, 'skyline_apiserver_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9998', 'listen_port': '9998', 'tls_backend': 'no'}}}})  2025-09-27 00:49:34.402565 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'skyline-console', 'value': {'container_name': 'skyline_console', 'group': 'skyline-console', 'enabled': True, 'image': 'registry.osism.tech/kolla/skyline-console:2024.2', 'volumes': ['/etc/kolla/skyline-console/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:9999/docs'], 'timeout': '30'}, 'haproxy': {'skyline_console': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9999', 'listen_port': '9999', 'tls_backend': 'no'}, 'skyline_console_external': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9999', 'listen_port': '9999', 'tls_backend': 'no'}}}})  2025-09-27 00:49:34.402571 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:49:34.402578 | orchestrator | 2025-09-27 00:49:34.402584 | orchestrator | TASK [haproxy-config : Configuring firewall for skyline] *********************** 2025-09-27 00:49:34.402590 | orchestrator | Saturday 27 September 2025 00:49:25 +0000 (0:00:00.641) 0:05:19.029 **** 2025-09-27 00:49:34.402597 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'skyline_apiserver', 'value': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9998', 'listen_port': '9998', 'tls_backend': 'no'}})  2025-09-27 00:49:34.402603 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'skyline_apiserver_external', 'value': {'enabled': 
'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9998', 'listen_port': '9998', 'tls_backend': 'no'}})  2025-09-27 00:49:34.402609 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'skyline_console', 'value': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9999', 'listen_port': '9999', 'tls_backend': 'no'}})  2025-09-27 00:49:34.402616 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'skyline_console_external', 'value': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9999', 'listen_port': '9999', 'tls_backend': 'no'}})  2025-09-27 00:49:34.402622 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:49:34.402628 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'skyline_apiserver', 'value': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9998', 'listen_port': '9998', 'tls_backend': 'no'}})  2025-09-27 00:49:34.402635 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'skyline_apiserver_external', 'value': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9998', 'listen_port': '9998', 'tls_backend': 'no'}})  2025-09-27 00:49:34.402641 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'skyline_console', 'value': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9999', 'listen_port': '9999', 'tls_backend': 'no'}})  2025-09-27 00:49:34.402647 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'skyline_console_external', 'value': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9999', 'listen_port': '9999', 'tls_backend': 'no'}})  2025-09-27 00:49:34.402654 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:49:34.402660 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'skyline_apiserver', 'value': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9998', 'listen_port': '9998', 'tls_backend': 'no'}})  2025-09-27 00:49:34.402669 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'skyline_apiserver_external', 'value': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9998', 'listen_port': '9998', 'tls_backend': 'no'}})  2025-09-27 00:49:34.402679 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'skyline_console', 'value': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '9999', 'listen_port': '9999', 'tls_backend': 'no'}})  2025-09-27 00:49:34.402686 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'skyline_console_external', 'value': {'enabled': 'yes', 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9999', 'listen_port': '9999', 'tls_backend': 'no'}})  2025-09-27 00:49:34.402695 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:49:34.402701 | orchestrator | 2025-09-27 00:49:34.402708 | orchestrator | TASK [proxysql-config : Copying over skyline ProxySQL users config] ************ 2025-09-27 00:49:34.402714 | orchestrator | Saturday 27 September 2025 00:49:27 +0000 (0:00:01.573) 0:05:20.603 **** 2025-09-27 00:49:34.402720 | orchestrator | changed: [testbed-node-0] 2025-09-27 00:49:34.402726 | orchestrator | changed: [testbed-node-1] 2025-09-27 00:49:34.402732 | orchestrator | changed: [testbed-node-2] 2025-09-27 00:49:34.402738 | orchestrator | 2025-09-27 00:49:34.402744 | orchestrator | TASK [proxysql-config : 
Copying over skyline ProxySQL rules config] ************ 2025-09-27 00:49:34.402751 | orchestrator | Saturday 27 September 2025 00:49:28 +0000 (0:00:01.290) 0:05:21.894 **** 2025-09-27 00:49:34.402757 | orchestrator | changed: [testbed-node-0] 2025-09-27 00:49:34.402763 | orchestrator | changed: [testbed-node-1] 2025-09-27 00:49:34.402769 | orchestrator | changed: [testbed-node-2] 2025-09-27 00:49:34.402775 | orchestrator | 2025-09-27 00:49:34.402781 | orchestrator | TASK [include_role : swift] **************************************************** 2025-09-27 00:49:34.402787 | orchestrator | Saturday 27 September 2025 00:49:30 +0000 (0:00:02.211) 0:05:24.105 **** 2025-09-27 00:49:34.402794 | orchestrator | fatal: [testbed-node-0]: FAILED! => {"msg": "The conditional check 'enable_swift | bool' failed. The error was: error while evaluating conditional (enable_swift | bool): 'enable_swift' is undefined\n\nThe error appears to be in '/ansible/kolla-loadbalancer.yml': line 207, column 7, but may\nbe elsewhere in the file depending on the exact syntax problem.\n\nThe offending line appears to be:\n\n when: enable_skyline | bool\n - include_role:\n ^ here\n"} 2025-09-27 00:49:34.402801 | orchestrator | fatal: [testbed-node-1]: FAILED! => {"msg": "The conditional check 'enable_swift | bool' failed. The error was: error while evaluating conditional (enable_swift | bool): 'enable_swift' is undefined\n\nThe error appears to be in '/ansible/kolla-loadbalancer.yml': line 207, column 7, but may\nbe elsewhere in the file depending on the exact syntax problem.\n\nThe offending line appears to be:\n\n when: enable_skyline | bool\n - include_role:\n ^ here\n"} 2025-09-27 00:49:34.402807 | orchestrator | fatal: [testbed-node-2]: FAILED! => {"msg": "The conditional check 'enable_swift | bool' failed. 
The error was: error while evaluating conditional (enable_swift | bool): 'enable_swift' is undefined\n\nThe error appears to be in '/ansible/kolla-loadbalancer.yml': line 207, column 7, but may\nbe elsewhere in the file depending on the exact syntax problem.\n\nThe offending line appears to be:\n\n when: enable_skyline | bool\n - include_role:\n ^ here\n"} 2025-09-27 00:49:34.402814 | orchestrator | 2025-09-27 00:49:34.402820 | orchestrator | PLAY RECAP ********************************************************************* 2025-09-27 00:49:34.402826 | orchestrator | testbed-node-0 : ok=111  changed=73  unreachable=0 failed=1  skipped=85  rescued=0 ignored=0 2025-09-27 00:49:34.402832 | orchestrator | testbed-node-1 : ok=110  changed=73  unreachable=0 failed=1  skipped=85  rescued=0 ignored=0 2025-09-27 00:49:34.402838 | orchestrator | testbed-node-2 : ok=110  changed=73  unreachable=0 failed=1  skipped=85  rescued=0 ignored=0 2025-09-27 00:49:34.402849 | orchestrator | 2025-09-27 00:49:34.402855 | orchestrator | 2025-09-27 00:49:34.402861 | orchestrator | TASKS RECAP ******************************************************************** 2025-09-27 00:49:34.402867 | orchestrator | Saturday 27 September 2025 00:49:31 +0000 (0:00:00.247) 0:05:24.353 **** 2025-09-27 00:49:34.402873 | orchestrator | =============================================================================== 2025-09-27 00:49:34.402880 | orchestrator | haproxy-config : Copying over skyline haproxy config -------------------- 6.06s 2025-09-27 00:49:34.402886 | orchestrator | haproxy-config : Copying over opensearch haproxy config ----------------- 5.29s 2025-09-27 00:49:34.402892 | orchestrator | haproxy-config : Copying over designate haproxy config ------------------ 4.86s 2025-09-27 00:49:34.402898 | orchestrator | haproxy-config : Copying over manila haproxy config --------------------- 4.56s 2025-09-27 00:49:34.402904 | orchestrator | haproxy-config : Copying over prometheus haproxy config ----------------- 4.40s 2025-09-27 00:49:34.402910 | orchestrator | haproxy-config : Copying over grafana haproxy config -------------------- 4.37s 2025-09-27 00:49:34.402920 | orchestrator | haproxy-config : Copying over cinder haproxy config --------------------- 4.37s 2025-09-27 00:49:34.402926 | orchestrator | haproxy-config : Copying over glance haproxy config --------------------- 4.18s 2025-09-27 00:49:34.402932 | orchestrator | haproxy-config : Copying over nova-cell:nova-novncproxy haproxy config --- 4.15s 2025-09-27 00:49:34.402939 | orchestrator | haproxy-config : Copying over keystone haproxy config ------------------- 4.07s 2025-09-27 00:49:34.402945 | orchestrator | haproxy-config : Copying over neutron haproxy config -------------------- 4.07s 2025-09-27 00:49:34.402951 | orchestrator | loadbalancer : Copying over config.json files for services -------------- 4.00s 2025-09-27 00:49:34.402957 | orchestrator | haproxy-config : Copying over horizon haproxy config -------------------- 3.97s 2025-09-27 00:49:34.402966 | orchestrator | haproxy-config : Copying over magnum haproxy config --------------------- 3.85s 2025-09-27 00:49:34.402972 | orchestrator | sysctl : Setting sysctl values ------------------------------------------ 3.85s 2025-09-27 00:49:34.402979 | orchestrator | haproxy-config : Copying over nova haproxy config ----------------------- 3.67s 2025-09-27 00:49:34.402985 | orchestrator | haproxy-config : Configuring firewall for glance ------------------------ 3.58s 2025-09-27 00:49:34.402991 | orchestrator | 
haproxy-config : Add configuration for glance when using single external frontend --- 3.39s 2025-09-27 00:49:34.402997 | orchestrator | loadbalancer : Ensuring config directories exist ------------------------ 3.39s 2025-09-27 00:49:34.403003 | orchestrator | loadbalancer : Copying checks for services which are enabled ------------ 3.37s 2025-09-27 00:49:34.403009 | orchestrator | 2025-09-27 00:49:34 | INFO  | Wait 1 second(s) until the next check 2025-09-27 00:49:37.412999 | orchestrator | 2025-09-27 00:49:37 | INFO  | Task 625d002c-1b01-4e2f-8d5c-d5ab32c60c73 is in state STARTED 2025-09-27 00:49:37.414307 | orchestrator | 2025-09-27 00:49:37 | INFO  | Task 4fa01eb7-5be3-437c-b4e5-8b90d9b0cd67 is in state STARTED 2025-09-27 00:49:37.415758 | orchestrator | 2025-09-27 00:49:37 | INFO  | Task 4edf2878-7e82-4d4c-a11c-81223c9994e2 is in state STARTED 2025-09-27 00:49:37.415786 | orchestrator | 2025-09-27 00:49:37 | INFO  | Wait 1 second(s) until the next check 2025-09-27 00:49:40.464827 | orchestrator | 2025-09-27 00:49:40 | INFO  | Task 625d002c-1b01-4e2f-8d5c-d5ab32c60c73 is in state STARTED 2025-09-27 00:49:40.466251 | orchestrator | 2025-09-27 00:49:40 | INFO  | Task 4fa01eb7-5be3-437c-b4e5-8b90d9b0cd67 is in state STARTED 2025-09-27 00:49:40.467521 | orchestrator | 2025-09-27 00:49:40 | INFO  | Task 4edf2878-7e82-4d4c-a11c-81223c9994e2 is in state STARTED 2025-09-27 00:49:40.468032 | orchestrator | 2025-09-27 00:49:40 | INFO  | Wait 1 second(s) until the next check 2025-09-27 00:49:43.503645 | orchestrator | 2025-09-27 00:49:43 | INFO  | Task 625d002c-1b01-4e2f-8d5c-d5ab32c60c73 is in state STARTED 2025-09-27 00:49:43.503971 | orchestrator | 2025-09-27 00:49:43 | INFO  | Task 4fa01eb7-5be3-437c-b4e5-8b90d9b0cd67 is in state STARTED 2025-09-27 00:49:43.504630 | orchestrator | 2025-09-27 00:49:43 | INFO  | Task 4edf2878-7e82-4d4c-a11c-81223c9994e2 is in state STARTED 2025-09-27 00:49:43.504757 | orchestrator | 2025-09-27 00:49:43 | INFO  | Wait 1 second(s) until the next check 2025-09-27 00:49:46.537184 | orchestrator | 2025-09-27 00:49:46 | INFO  | Task 625d002c-1b01-4e2f-8d5c-d5ab32c60c73 is in state STARTED 2025-09-27 00:49:46.537457 | orchestrator | 2025-09-27 00:49:46 | INFO  | Task 4fa01eb7-5be3-437c-b4e5-8b90d9b0cd67 is in state STARTED 2025-09-27 00:49:46.538261 | orchestrator | 2025-09-27 00:49:46 | INFO  | Task 4edf2878-7e82-4d4c-a11c-81223c9994e2 is in state STARTED 2025-09-27 00:49:46.538290 | orchestrator | 2025-09-27 00:49:46 | INFO  | Wait 1 second(s) until the next check 2025-09-27 00:49:49.566534 | orchestrator | 2025-09-27 00:49:49 | INFO  | Task 625d002c-1b01-4e2f-8d5c-d5ab32c60c73 is in state STARTED 2025-09-27 00:49:49.567656 | orchestrator | 2025-09-27 00:49:49 | INFO  | Task 4fa01eb7-5be3-437c-b4e5-8b90d9b0cd67 is in state STARTED 2025-09-27 00:49:49.568329 | orchestrator | 2025-09-27 00:49:49 | INFO  | Task 4edf2878-7e82-4d4c-a11c-81223c9994e2 is in state STARTED 2025-09-27 00:49:49.568472 | orchestrator | 2025-09-27 00:49:49 | INFO  | Wait 1 second(s) until the next check 2025-09-27 00:49:52.606609 | orchestrator | 2025-09-27 00:49:52 | INFO  | Task 625d002c-1b01-4e2f-8d5c-d5ab32c60c73 is in state STARTED 2025-09-27 00:49:52.608553 | orchestrator | 2025-09-27 00:49:52 | INFO  | Task 4fa01eb7-5be3-437c-b4e5-8b90d9b0cd67 is in state STARTED 2025-09-27 00:49:52.611585 | orchestrator | 2025-09-27 00:49:52 | INFO  | Task 4edf2878-7e82-4d4c-a11c-81223c9994e2 is in state STARTED 2025-09-27 00:49:52.611604 | orchestrator | 2025-09-27 00:49:52 | INFO  | Wait 1 second(s) 
until the next check 2025-09-27 00:49:55.644422 | orchestrator | 2025-09-27 00:49:55 | INFO  | Task 625d002c-1b01-4e2f-8d5c-d5ab32c60c73 is in state STARTED 2025-09-27 00:49:55.644864 | orchestrator | 2025-09-27 00:49:55 | INFO  | Task 4fa01eb7-5be3-437c-b4e5-8b90d9b0cd67 is in state STARTED 2025-09-27 00:49:55.645626 | orchestrator | 2025-09-27 00:49:55 | INFO  | Task 4edf2878-7e82-4d4c-a11c-81223c9994e2 is in state STARTED 2025-09-27 00:49:55.645745 | orchestrator | 2025-09-27 00:49:55 | INFO  | Wait 1 second(s) until the next check 2025-09-27 00:49:58.681175 | orchestrator | 2025-09-27 00:49:58 | INFO  | Task 625d002c-1b01-4e2f-8d5c-d5ab32c60c73 is in state STARTED 2025-09-27 00:49:58.682353 | orchestrator | 2025-09-27 00:49:58 | INFO  | Task 4fa01eb7-5be3-437c-b4e5-8b90d9b0cd67 is in state STARTED 2025-09-27 00:49:58.684127 | orchestrator | 2025-09-27 00:49:58 | INFO  | Task 4edf2878-7e82-4d4c-a11c-81223c9994e2 is in state STARTED 2025-09-27 00:49:58.684152 | orchestrator | 2025-09-27 00:49:58 | INFO  | Wait 1 second(s) until the next check 2025-09-27 00:50:01.729038 | orchestrator | 2025-09-27 00:50:01 | INFO  | Task 625d002c-1b01-4e2f-8d5c-d5ab32c60c73 is in state STARTED 2025-09-27 00:50:01.729685 | orchestrator | 2025-09-27 00:50:01 | INFO  | Task 4fa01eb7-5be3-437c-b4e5-8b90d9b0cd67 is in state STARTED 2025-09-27 00:50:01.731579 | orchestrator | 2025-09-27 00:50:01 | INFO  | Task 4edf2878-7e82-4d4c-a11c-81223c9994e2 is in state STARTED 2025-09-27 00:50:01.731608 | orchestrator | 2025-09-27 00:50:01 | INFO  | Wait 1 second(s) until the next check 2025-09-27 00:50:04.784361 | orchestrator | 2025-09-27 00:50:04 | INFO  | Task 625d002c-1b01-4e2f-8d5c-d5ab32c60c73 is in state STARTED 2025-09-27 00:50:04.785998 | orchestrator | 2025-09-27 00:50:04 | INFO  | Task 4fa01eb7-5be3-437c-b4e5-8b90d9b0cd67 is in state STARTED 2025-09-27 00:50:04.787789 | orchestrator | 2025-09-27 00:50:04 | INFO  | Task 4edf2878-7e82-4d4c-a11c-81223c9994e2 is in state STARTED 2025-09-27 00:50:04.788478 | orchestrator | 2025-09-27 00:50:04 | INFO  | Wait 1 second(s) until the next check 2025-09-27 00:50:07.831005 | orchestrator | 2025-09-27 00:50:07 | INFO  | Task 625d002c-1b01-4e2f-8d5c-d5ab32c60c73 is in state STARTED 2025-09-27 00:50:07.833721 | orchestrator | 2025-09-27 00:50:07 | INFO  | Task 4fa01eb7-5be3-437c-b4e5-8b90d9b0cd67 is in state STARTED 2025-09-27 00:50:07.836334 | orchestrator | 2025-09-27 00:50:07 | INFO  | Task 4edf2878-7e82-4d4c-a11c-81223c9994e2 is in state STARTED 2025-09-27 00:50:07.836859 | orchestrator | 2025-09-27 00:50:07 | INFO  | Wait 1 second(s) until the next check 2025-09-27 00:50:10.887381 | orchestrator | 2025-09-27 00:50:10 | INFO  | Task 625d002c-1b01-4e2f-8d5c-d5ab32c60c73 is in state STARTED 2025-09-27 00:50:10.888587 | orchestrator | 2025-09-27 00:50:10 | INFO  | Task 4fa01eb7-5be3-437c-b4e5-8b90d9b0cd67 is in state STARTED 2025-09-27 00:50:10.889631 | orchestrator | 2025-09-27 00:50:10 | INFO  | Task 4edf2878-7e82-4d4c-a11c-81223c9994e2 is in state STARTED 2025-09-27 00:50:10.889655 | orchestrator | 2025-09-27 00:50:10 | INFO  | Wait 1 second(s) until the next check 2025-09-27 00:50:13.936838 | orchestrator | 2025-09-27 00:50:13 | INFO  | Task 625d002c-1b01-4e2f-8d5c-d5ab32c60c73 is in state STARTED 2025-09-27 00:50:13.938321 | orchestrator | 2025-09-27 00:50:13 | INFO  | Task 4fa01eb7-5be3-437c-b4e5-8b90d9b0cd67 is in state STARTED 2025-09-27 00:50:13.941945 | orchestrator | 2025-09-27 00:50:13 | INFO  | Task 4edf2878-7e82-4d4c-a11c-81223c9994e2 is in state STARTED 
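The fatal error reported in the play recap above comes from /ansible/kolla-loadbalancer.yml evaluating 'enable_swift | bool' while enable_swift is undefined on the target hosts. A minimal sketch of the usual guard for such an optional role include follows; the task layout is assumed from the error message and is illustrative only, not the actual kolla-ansible/OSISM source.

    # Illustrative only: default the feature flag so an unset enable_swift
    # skips the role instead of failing the conditional check.
    - include_role:
        name: swift
      when: enable_swift | default(false) | bool

Equivalently, defining enable_swift explicitly in the environment configuration (for example enable_swift: "no") would let the existing 'enable_swift | bool' check evaluate without error.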
2025-09-27 00:50:13.941969 | orchestrator | 2025-09-27 00:50:13 | INFO  | Wait 1 second(s) until the next check
[... the same status checks for tasks 625d002c-1b01-4e2f-8d5c-d5ab32c60c73, 4fa01eb7-5be3-437c-b4e5-8b90d9b0cd67 and 4edf2878-7e82-4d4c-a11c-81223c9994e2, each followed by "Wait 1 second(s) until the next check", repeat roughly every three seconds from 00:50:16 through 00:53:20; all three tasks remain in state STARTED throughout ...]
2025-09-27 00:53:23.083743 | orchestrator | 2025-09-27 00:53:23 | INFO  | Task 625d002c-1b01-4e2f-8d5c-d5ab32c60c73 is in state
STARTED 2025-09-27 00:53:23.085841 | orchestrator | 2025-09-27 00:53:23 | INFO  | Task 4fa01eb7-5be3-437c-b4e5-8b90d9b0cd67 is in state STARTED 2025-09-27 00:53:23.087774 | orchestrator | 2025-09-27 00:53:23 | INFO  | Task 4edf2878-7e82-4d4c-a11c-81223c9994e2 is in state STARTED 2025-09-27 00:53:23.088168 | orchestrator | 2025-09-27 00:53:23 | INFO  | Wait 1 second(s) until the next check 2025-09-27 00:53:26.125666 | orchestrator | 2025-09-27 00:53:26 | INFO  | Task 625d002c-1b01-4e2f-8d5c-d5ab32c60c73 is in state STARTED 2025-09-27 00:53:26.127229 | orchestrator | 2025-09-27 00:53:26 | INFO  | Task 4fa01eb7-5be3-437c-b4e5-8b90d9b0cd67 is in state STARTED 2025-09-27 00:53:26.130443 | orchestrator | 2025-09-27 00:53:26 | INFO  | Task 4edf2878-7e82-4d4c-a11c-81223c9994e2 is in state STARTED 2025-09-27 00:53:26.130468 | orchestrator | 2025-09-27 00:53:26 | INFO  | Wait 1 second(s) until the next check 2025-09-27 00:53:29.177204 | orchestrator | 2025-09-27 00:53:29 | INFO  | Task 625d002c-1b01-4e2f-8d5c-d5ab32c60c73 is in state STARTED 2025-09-27 00:53:29.179511 | orchestrator | 2025-09-27 00:53:29 | INFO  | Task 4fa01eb7-5be3-437c-b4e5-8b90d9b0cd67 is in state STARTED 2025-09-27 00:53:29.180728 | orchestrator | 2025-09-27 00:53:29 | INFO  | Task 4edf2878-7e82-4d4c-a11c-81223c9994e2 is in state STARTED 2025-09-27 00:53:29.182391 | orchestrator | 2025-09-27 00:53:29 | INFO  | Wait 1 second(s) until the next check 2025-09-27 00:53:32.223172 | orchestrator | 2025-09-27 00:53:32 | INFO  | Task 625d002c-1b01-4e2f-8d5c-d5ab32c60c73 is in state STARTED 2025-09-27 00:53:32.224452 | orchestrator | 2025-09-27 00:53:32 | INFO  | Task 4fa01eb7-5be3-437c-b4e5-8b90d9b0cd67 is in state STARTED 2025-09-27 00:53:32.226554 | orchestrator | 2025-09-27 00:53:32 | INFO  | Task 4edf2878-7e82-4d4c-a11c-81223c9994e2 is in state SUCCESS 2025-09-27 00:53:32.226711 | orchestrator | 2025-09-27 00:53:32 | INFO  | Wait 1 second(s) until the next check 2025-09-27 00:53:32.228326 | orchestrator | 2025-09-27 00:53:32.228364 | orchestrator | 2025-09-27 00:53:32.228376 | orchestrator | PLAY [Group hosts based on configuration] ************************************** 2025-09-27 00:53:32.228388 | orchestrator | 2025-09-27 00:53:32.228399 | orchestrator | TASK [Group hosts based on Kolla action] *************************************** 2025-09-27 00:53:32.228411 | orchestrator | Saturday 27 September 2025 00:49:35 +0000 (0:00:00.267) 0:00:00.267 **** 2025-09-27 00:53:32.228423 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:53:32.228435 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:53:32.228446 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:53:32.228457 | orchestrator | 2025-09-27 00:53:32.228468 | orchestrator | TASK [Group hosts based on enabled services] *********************************** 2025-09-27 00:53:32.228479 | orchestrator | Saturday 27 September 2025 00:49:35 +0000 (0:00:00.272) 0:00:00.540 **** 2025-09-27 00:53:32.228490 | orchestrator | ok: [testbed-node-0] => (item=enable_opensearch_True) 2025-09-27 00:53:32.228501 | orchestrator | ok: [testbed-node-1] => (item=enable_opensearch_True) 2025-09-27 00:53:32.228529 | orchestrator | ok: [testbed-node-2] => (item=enable_opensearch_True) 2025-09-27 00:53:32.228540 | orchestrator | 2025-09-27 00:53:32.228551 | orchestrator | PLAY [Apply role opensearch] *************************************************** 2025-09-27 00:53:32.228562 | orchestrator | 2025-09-27 00:53:32.228573 | orchestrator | TASK [opensearch : include_tasks] 
********************************************** 2025-09-27 00:53:32.228584 | orchestrator | Saturday 27 September 2025 00:49:36 +0000 (0:00:00.389) 0:00:00.930 **** 2025-09-27 00:53:32.228595 | orchestrator | included: /ansible/roles/opensearch/tasks/deploy.yml for testbed-node-0, testbed-node-1, testbed-node-2 2025-09-27 00:53:32.228606 | orchestrator | 2025-09-27 00:53:32.228617 | orchestrator | TASK [opensearch : Setting sysctl values] ************************************** 2025-09-27 00:53:32.228628 | orchestrator | Saturday 27 September 2025 00:49:36 +0000 (0:00:00.474) 0:00:01.404 **** 2025-09-27 00:53:32.228660 | orchestrator | changed: [testbed-node-0] => (item={'name': 'vm.max_map_count', 'value': 262144}) 2025-09-27 00:53:32.228672 | orchestrator | changed: [testbed-node-1] => (item={'name': 'vm.max_map_count', 'value': 262144}) 2025-09-27 00:53:32.228682 | orchestrator | changed: [testbed-node-2] => (item={'name': 'vm.max_map_count', 'value': 262144}) 2025-09-27 00:53:32.228693 | orchestrator | 2025-09-27 00:53:32.228703 | orchestrator | TASK [opensearch : Ensuring config directories exist] ************************** 2025-09-27 00:53:32.228714 | orchestrator | Saturday 27 September 2025 00:49:37 +0000 (0:00:00.629) 0:00:02.033 **** 2025-09-27 00:53:32.228728 | orchestrator | changed: [testbed-node-0] => (item={'key': 'opensearch', 'value': {'container_name': 'opensearch', 'group': 'opensearch', 'enabled': True, 'image': 'registry.osism.tech/kolla/opensearch:2024.2', 'environment': {'OPENSEARCH_JAVA_OPTS': '-Xms1g -Xmx1g -Dlog4j2.formatMsgNoLookups=true'}, 'volumes': ['/etc/kolla/opensearch/:/var/lib/kolla/config_files/', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'opensearch:/var/lib/opensearch/data', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:9200'], 'timeout': '30'}, 'haproxy': {'opensearch': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9200', 'frontend_http_extra': ['option dontlog-normal']}}}}) 2025-09-27 00:53:32.228743 | orchestrator | changed: [testbed-node-1] => (item={'key': 'opensearch', 'value': {'container_name': 'opensearch', 'group': 'opensearch', 'enabled': True, 'image': 'registry.osism.tech/kolla/opensearch:2024.2', 'environment': {'OPENSEARCH_JAVA_OPTS': '-Xms1g -Xmx1g -Dlog4j2.formatMsgNoLookups=true'}, 'volumes': ['/etc/kolla/opensearch/:/var/lib/kolla/config_files/', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'opensearch:/var/lib/opensearch/data', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:9200'], 'timeout': '30'}, 'haproxy': {'opensearch': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9200', 'frontend_http_extra': ['option dontlog-normal']}}}}) 2025-09-27 00:53:32.228767 | orchestrator | changed: [testbed-node-2] => (item={'key': 'opensearch', 'value': {'container_name': 'opensearch', 'group': 'opensearch', 'enabled': True, 'image': 'registry.osism.tech/kolla/opensearch:2024.2', 'environment': {'OPENSEARCH_JAVA_OPTS': '-Xms1g -Xmx1g -Dlog4j2.formatMsgNoLookups=true'}, 'volumes': ['/etc/kolla/opensearch/:/var/lib/kolla/config_files/', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'opensearch:/var/lib/opensearch/data', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 
'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:9200'], 'timeout': '30'}, 'haproxy': {'opensearch': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9200', 'frontend_http_extra': ['option dontlog-normal']}}}}) 2025-09-27 00:53:32.228788 | orchestrator | changed: [testbed-node-0] => (item={'key': 'opensearch-dashboards', 'value': {'container_name': 'opensearch_dashboards', 'group': 'opensearch-dashboards', 'enabled': True, 'environment': {'OPENSEARCH_DASHBOARDS_SECURITY_PLUGIN': 'False'}, 'image': 'registry.osism.tech/kolla/opensearch-dashboards:2024.2', 'volumes': ['/etc/kolla/opensearch-dashboards/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:5601'], 'timeout': '30'}, 'haproxy': {'opensearch-dashboards': {'enabled': True, 'mode': 'http', 'external': False, 'port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password'}, 'opensearch_dashboards_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '5601', 'listen_port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password'}}}}) 2025-09-27 00:53:32.228810 | orchestrator | changed: [testbed-node-1] => (item={'key': 'opensearch-dashboards', 'value': {'container_name': 'opensearch_dashboards', 'group': 'opensearch-dashboards', 'enabled': True, 'environment': {'OPENSEARCH_DASHBOARDS_SECURITY_PLUGIN': 'False'}, 'image': 'registry.osism.tech/kolla/opensearch-dashboards:2024.2', 'volumes': ['/etc/kolla/opensearch-dashboards/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:5601'], 'timeout': '30'}, 'haproxy': {'opensearch-dashboards': {'enabled': True, 'mode': 'http', 'external': False, 'port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password'}, 'opensearch_dashboards_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '5601', 'listen_port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password'}}}}) 2025-09-27 00:53:32.228823 | orchestrator | changed: [testbed-node-2] => (item={'key': 'opensearch-dashboards', 'value': {'container_name': 'opensearch_dashboards', 'group': 'opensearch-dashboards', 'enabled': True, 'environment': {'OPENSEARCH_DASHBOARDS_SECURITY_PLUGIN': 'False'}, 'image': 'registry.osism.tech/kolla/opensearch-dashboards:2024.2', 'volumes': ['/etc/kolla/opensearch-dashboards/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:5601'], 'timeout': '30'}, 'haproxy': {'opensearch-dashboards': {'enabled': True, 'mode': 'http', 'external': False, 'port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password'}, 'opensearch_dashboards_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '5601', 'listen_port': '5601', 'auth_user': 'opensearch', 
'auth_pass': 'password'}}}}) 2025-09-27 00:53:32.228835 | orchestrator | 2025-09-27 00:53:32.228846 | orchestrator | TASK [opensearch : include_tasks] ********************************************** 2025-09-27 00:53:32.228857 | orchestrator | Saturday 27 September 2025 00:49:38 +0000 (0:00:01.503) 0:00:03.537 **** 2025-09-27 00:53:32.228868 | orchestrator | included: /ansible/roles/opensearch/tasks/copy-certs.yml for testbed-node-0, testbed-node-1, testbed-node-2 2025-09-27 00:53:32.228879 | orchestrator | 2025-09-27 00:53:32.228889 | orchestrator | TASK [service-cert-copy : opensearch | Copying over extra CA certificates] ***** 2025-09-27 00:53:32.228900 | orchestrator | Saturday 27 September 2025 00:49:39 +0000 (0:00:00.644) 0:00:04.182 **** 2025-09-27 00:53:32.228919 | orchestrator | changed: [testbed-node-2] => (item={'key': 'opensearch', 'value': {'container_name': 'opensearch', 'group': 'opensearch', 'enabled': True, 'image': 'registry.osism.tech/kolla/opensearch:2024.2', 'environment': {'OPENSEARCH_JAVA_OPTS': '-Xms1g -Xmx1g -Dlog4j2.formatMsgNoLookups=true'}, 'volumes': ['/etc/kolla/opensearch/:/var/lib/kolla/config_files/', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'opensearch:/var/lib/opensearch/data', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:9200'], 'timeout': '30'}, 'haproxy': {'opensearch': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9200', 'frontend_http_extra': ['option dontlog-normal']}}}}) 2025-09-27 00:53:32.228936 | orchestrator | changed: [testbed-node-1] => (item={'key': 'opensearch', 'value': {'container_name': 'opensearch', 'group': 'opensearch', 'enabled': True, 'image': 'registry.osism.tech/kolla/opensearch:2024.2', 'environment': {'OPENSEARCH_JAVA_OPTS': '-Xms1g -Xmx1g -Dlog4j2.formatMsgNoLookups=true'}, 'volumes': ['/etc/kolla/opensearch/:/var/lib/kolla/config_files/', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'opensearch:/var/lib/opensearch/data', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:9200'], 'timeout': '30'}, 'haproxy': {'opensearch': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9200', 'frontend_http_extra': ['option dontlog-normal']}}}}) 2025-09-27 00:53:32.228955 | orchestrator | changed: [testbed-node-0] => (item={'key': 'opensearch', 'value': {'container_name': 'opensearch', 'group': 'opensearch', 'enabled': True, 'image': 'registry.osism.tech/kolla/opensearch:2024.2', 'environment': {'OPENSEARCH_JAVA_OPTS': '-Xms1g -Xmx1g -Dlog4j2.formatMsgNoLookups=true'}, 'volumes': ['/etc/kolla/opensearch/:/var/lib/kolla/config_files/', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'opensearch:/var/lib/opensearch/data', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:9200'], 'timeout': '30'}, 'haproxy': {'opensearch': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9200', 'frontend_http_extra': ['option dontlog-normal']}}}}) 2025-09-27 00:53:32.228967 | orchestrator | changed: [testbed-node-0] => (item={'key': 'opensearch-dashboards', 'value': {'container_name': 'opensearch_dashboards', 'group': 'opensearch-dashboards', 'enabled': True, 
'environment': {'OPENSEARCH_DASHBOARDS_SECURITY_PLUGIN': 'False'}, 'image': 'registry.osism.tech/kolla/opensearch-dashboards:2024.2', 'volumes': ['/etc/kolla/opensearch-dashboards/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:5601'], 'timeout': '30'}, 'haproxy': {'opensearch-dashboards': {'enabled': True, 'mode': 'http', 'external': False, 'port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password'}, 'opensearch_dashboards_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '5601', 'listen_port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password'}}}}) 2025-09-27 00:53:32.228985 | orchestrator | changed: [testbed-node-2] => (item={'key': 'opensearch-dashboards', 'value': {'container_name': 'opensearch_dashboards', 'group': 'opensearch-dashboards', 'enabled': True, 'environment': {'OPENSEARCH_DASHBOARDS_SECURITY_PLUGIN': 'False'}, 'image': 'registry.osism.tech/kolla/opensearch-dashboards:2024.2', 'volumes': ['/etc/kolla/opensearch-dashboards/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:5601'], 'timeout': '30'}, 'haproxy': {'opensearch-dashboards': {'enabled': True, 'mode': 'http', 'external': False, 'port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password'}, 'opensearch_dashboards_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '5601', 'listen_port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password'}}}}) 2025-09-27 00:53:32.229003 | orchestrator | changed: [testbed-node-1] => (item={'key': 'opensearch-dashboards', 'value': {'container_name': 'opensearch_dashboards', 'group': 'opensearch-dashboards', 'enabled': True, 'environment': {'OPENSEARCH_DASHBOARDS_SECURITY_PLUGIN': 'False'}, 'image': 'registry.osism.tech/kolla/opensearch-dashboards:2024.2', 'volumes': ['/etc/kolla/opensearch-dashboards/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:5601'], 'timeout': '30'}, 'haproxy': {'opensearch-dashboards': {'enabled': True, 'mode': 'http', 'external': False, 'port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password'}, 'opensearch_dashboards_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '5601', 'listen_port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password'}}}}) 2025-09-27 00:53:32.229021 | orchestrator | 2025-09-27 00:53:32.229033 | orchestrator | TASK [service-cert-copy : opensearch | Copying over backend internal TLS certificate] *** 2025-09-27 00:53:32.229044 | orchestrator | Saturday 27 September 2025 00:49:41 +0000 (0:00:02.421) 0:00:06.604 **** 2025-09-27 00:53:32.229055 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'opensearch', 'value': {'container_name': 'opensearch', 'group': 'opensearch', 'enabled': True, 'image': 
'registry.osism.tech/kolla/opensearch:2024.2', 'environment': {'OPENSEARCH_JAVA_OPTS': '-Xms1g -Xmx1g -Dlog4j2.formatMsgNoLookups=true'}, 'volumes': ['/etc/kolla/opensearch/:/var/lib/kolla/config_files/', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'opensearch:/var/lib/opensearch/data', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:9200'], 'timeout': '30'}, 'haproxy': {'opensearch': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9200', 'frontend_http_extra': ['option dontlog-normal']}}}})  2025-09-27 00:53:32.229067 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'opensearch-dashboards', 'value': {'container_name': 'opensearch_dashboards', 'group': 'opensearch-dashboards', 'enabled': True, 'environment': {'OPENSEARCH_DASHBOARDS_SECURITY_PLUGIN': 'False'}, 'image': 'registry.osism.tech/kolla/opensearch-dashboards:2024.2', 'volumes': ['/etc/kolla/opensearch-dashboards/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:5601'], 'timeout': '30'}, 'haproxy': {'opensearch-dashboards': {'enabled': True, 'mode': 'http', 'external': False, 'port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password'}, 'opensearch_dashboards_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '5601', 'listen_port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password'}}}})  2025-09-27 00:53:32.229078 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:53:32.229090 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'opensearch', 'value': {'container_name': 'opensearch', 'group': 'opensearch', 'enabled': True, 'image': 'registry.osism.tech/kolla/opensearch:2024.2', 'environment': {'OPENSEARCH_JAVA_OPTS': '-Xms1g -Xmx1g -Dlog4j2.formatMsgNoLookups=true'}, 'volumes': ['/etc/kolla/opensearch/:/var/lib/kolla/config_files/', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'opensearch:/var/lib/opensearch/data', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:9200'], 'timeout': '30'}, 'haproxy': {'opensearch': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9200', 'frontend_http_extra': ['option dontlog-normal']}}}})  2025-09-27 00:53:32.229136 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'opensearch-dashboards', 'value': {'container_name': 'opensearch_dashboards', 'group': 'opensearch-dashboards', 'enabled': True, 'environment': {'OPENSEARCH_DASHBOARDS_SECURITY_PLUGIN': 'False'}, 'image': 'registry.osism.tech/kolla/opensearch-dashboards:2024.2', 'volumes': ['/etc/kolla/opensearch-dashboards/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:5601'], 'timeout': '30'}, 'haproxy': {'opensearch-dashboards': {'enabled': True, 'mode': 'http', 'external': False, 'port': '5601', 'auth_user': 'opensearch', 'auth_pass': 
'password'}, 'opensearch_dashboards_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '5601', 'listen_port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password'}}}})  2025-09-27 00:53:32.229156 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:53:32.229168 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'opensearch', 'value': {'container_name': 'opensearch', 'group': 'opensearch', 'enabled': True, 'image': 'registry.osism.tech/kolla/opensearch:2024.2', 'environment': {'OPENSEARCH_JAVA_OPTS': '-Xms1g -Xmx1g -Dlog4j2.formatMsgNoLookups=true'}, 'volumes': ['/etc/kolla/opensearch/:/var/lib/kolla/config_files/', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'opensearch:/var/lib/opensearch/data', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:9200'], 'timeout': '30'}, 'haproxy': {'opensearch': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9200', 'frontend_http_extra': ['option dontlog-normal']}}}})  2025-09-27 00:53:32.229180 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'opensearch-dashboards', 'value': {'container_name': 'opensearch_dashboards', 'group': 'opensearch-dashboards', 'enabled': True, 'environment': {'OPENSEARCH_DASHBOARDS_SECURITY_PLUGIN': 'False'}, 'image': 'registry.osism.tech/kolla/opensearch-dashboards:2024.2', 'volumes': ['/etc/kolla/opensearch-dashboards/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:5601'], 'timeout': '30'}, 'haproxy': {'opensearch-dashboards': {'enabled': True, 'mode': 'http', 'external': False, 'port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password'}, 'opensearch_dashboards_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '5601', 'listen_port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password'}}}})  2025-09-27 00:53:32.229191 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:53:32.229202 | orchestrator | 2025-09-27 00:53:32.229213 | orchestrator | TASK [service-cert-copy : opensearch | Copying over backend internal TLS key] *** 2025-09-27 00:53:32.229224 | orchestrator | Saturday 27 September 2025 00:49:43 +0000 (0:00:01.159) 0:00:07.763 **** 2025-09-27 00:53:32.229235 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'opensearch', 'value': {'container_name': 'opensearch', 'group': 'opensearch', 'enabled': True, 'image': 'registry.osism.tech/kolla/opensearch:2024.2', 'environment': {'OPENSEARCH_JAVA_OPTS': '-Xms1g -Xmx1g -Dlog4j2.formatMsgNoLookups=true'}, 'volumes': ['/etc/kolla/opensearch/:/var/lib/kolla/config_files/', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'opensearch:/var/lib/opensearch/data', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:9200'], 'timeout': '30'}, 'haproxy': {'opensearch': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9200', 'frontend_http_extra': ['option dontlog-normal']}}}})  2025-09-27 00:53:32.229259 | orchestrator | skipping: 
[testbed-node-0] => (item={'key': 'opensearch-dashboards', 'value': {'container_name': 'opensearch_dashboards', 'group': 'opensearch-dashboards', 'enabled': True, 'environment': {'OPENSEARCH_DASHBOARDS_SECURITY_PLUGIN': 'False'}, 'image': 'registry.osism.tech/kolla/opensearch-dashboards:2024.2', 'volumes': ['/etc/kolla/opensearch-dashboards/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:5601'], 'timeout': '30'}, 'haproxy': {'opensearch-dashboards': {'enabled': True, 'mode': 'http', 'external': False, 'port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password'}, 'opensearch_dashboards_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '5601', 'listen_port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password'}}}})  2025-09-27 00:53:32.229282 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:53:32.229294 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'opensearch', 'value': {'container_name': 'opensearch', 'group': 'opensearch', 'enabled': True, 'image': 'registry.osism.tech/kolla/opensearch:2024.2', 'environment': {'OPENSEARCH_JAVA_OPTS': '-Xms1g -Xmx1g -Dlog4j2.formatMsgNoLookups=true'}, 'volumes': ['/etc/kolla/opensearch/:/var/lib/kolla/config_files/', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'opensearch:/var/lib/opensearch/data', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:9200'], 'timeout': '30'}, 'haproxy': {'opensearch': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9200', 'frontend_http_extra': ['option dontlog-normal']}}}})  2025-09-27 00:53:32.229306 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'opensearch-dashboards', 'value': {'container_name': 'opensearch_dashboards', 'group': 'opensearch-dashboards', 'enabled': True, 'environment': {'OPENSEARCH_DASHBOARDS_SECURITY_PLUGIN': 'False'}, 'image': 'registry.osism.tech/kolla/opensearch-dashboards:2024.2', 'volumes': ['/etc/kolla/opensearch-dashboards/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:5601'], 'timeout': '30'}, 'haproxy': {'opensearch-dashboards': {'enabled': True, 'mode': 'http', 'external': False, 'port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password'}, 'opensearch_dashboards_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '5601', 'listen_port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password'}}}})  2025-09-27 00:53:32.229317 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:53:32.229328 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'opensearch', 'value': {'container_name': 'opensearch', 'group': 'opensearch', 'enabled': True, 'image': 'registry.osism.tech/kolla/opensearch:2024.2', 'environment': {'OPENSEARCH_JAVA_OPTS': '-Xms1g -Xmx1g -Dlog4j2.formatMsgNoLookups=true'}, 'volumes': ['/etc/kolla/opensearch/:/var/lib/kolla/config_files/', 
'/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'opensearch:/var/lib/opensearch/data', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:9200'], 'timeout': '30'}, 'haproxy': {'opensearch': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9200', 'frontend_http_extra': ['option dontlog-normal']}}}})  2025-09-27 00:53:32.229353 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'opensearch-dashboards', 'value': {'container_name': 'opensearch_dashboards', 'group': 'opensearch-dashboards', 'enabled': True, 'environment': {'OPENSEARCH_DASHBOARDS_SECURITY_PLUGIN': 'False'}, 'image': 'registry.osism.tech/kolla/opensearch-dashboards:2024.2', 'volumes': ['/etc/kolla/opensearch-dashboards/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:5601'], 'timeout': '30'}, 'haproxy': {'opensearch-dashboards': {'enabled': True, 'mode': 'http', 'external': False, 'port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password'}, 'opensearch_dashboards_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '5601', 'listen_port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password'}}}})  2025-09-27 00:53:32.229371 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:53:32.229382 | orchestrator | 2025-09-27 00:53:32.229393 | orchestrator | TASK [opensearch : Copying over config.json files for services] **************** 2025-09-27 00:53:32.229404 | orchestrator | Saturday 27 September 2025 00:49:44 +0000 (0:00:01.440) 0:00:09.203 **** 2025-09-27 00:53:32.229415 | orchestrator | changed: [testbed-node-2] => (item={'key': 'opensearch', 'value': {'container_name': 'opensearch', 'group': 'opensearch', 'enabled': True, 'image': 'registry.osism.tech/kolla/opensearch:2024.2', 'environment': {'OPENSEARCH_JAVA_OPTS': '-Xms1g -Xmx1g -Dlog4j2.formatMsgNoLookups=true'}, 'volumes': ['/etc/kolla/opensearch/:/var/lib/kolla/config_files/', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'opensearch:/var/lib/opensearch/data', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:9200'], 'timeout': '30'}, 'haproxy': {'opensearch': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9200', 'frontend_http_extra': ['option dontlog-normal']}}}}) 2025-09-27 00:53:32.229427 | orchestrator | changed: [testbed-node-0] => (item={'key': 'opensearch', 'value': {'container_name': 'opensearch', 'group': 'opensearch', 'enabled': True, 'image': 'registry.osism.tech/kolla/opensearch:2024.2', 'environment': {'OPENSEARCH_JAVA_OPTS': '-Xms1g -Xmx1g -Dlog4j2.formatMsgNoLookups=true'}, 'volumes': ['/etc/kolla/opensearch/:/var/lib/kolla/config_files/', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'opensearch:/var/lib/opensearch/data', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:9200'], 'timeout': '30'}, 'haproxy': {'opensearch': {'enabled': True, 'mode': 'http', 
'external': False, 'port': '9200', 'frontend_http_extra': ['option dontlog-normal']}}}}) 2025-09-27 00:53:32.229438 | orchestrator | changed: [testbed-node-1] => (item={'key': 'opensearch', 'value': {'container_name': 'opensearch', 'group': 'opensearch', 'enabled': True, 'image': 'registry.osism.tech/kolla/opensearch:2024.2', 'environment': {'OPENSEARCH_JAVA_OPTS': '-Xms1g -Xmx1g -Dlog4j2.formatMsgNoLookups=true'}, 'volumes': ['/etc/kolla/opensearch/:/var/lib/kolla/config_files/', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'opensearch:/var/lib/opensearch/data', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:9200'], 'timeout': '30'}, 'haproxy': {'opensearch': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9200', 'frontend_http_extra': ['option dontlog-normal']}}}}) 2025-09-27 00:53:32.229457 | orchestrator | changed: [testbed-node-0] => (item={'key': 'opensearch-dashboards', 'value': {'container_name': 'opensearch_dashboards', 'group': 'opensearch-dashboards', 'enabled': True, 'environment': {'OPENSEARCH_DASHBOARDS_SECURITY_PLUGIN': 'False'}, 'image': 'registry.osism.tech/kolla/opensearch-dashboards:2024.2', 'volumes': ['/etc/kolla/opensearch-dashboards/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:5601'], 'timeout': '30'}, 'haproxy': {'opensearch-dashboards': {'enabled': True, 'mode': 'http', 'external': False, 'port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password'}, 'opensearch_dashboards_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '5601', 'listen_port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password'}}}}) 2025-09-27 00:53:32.229487 | orchestrator | changed: [testbed-node-2] => (item={'key': 'opensearch-dashboards', 'value': {'container_name': 'opensearch_dashboards', 'group': 'opensearch-dashboards', 'enabled': True, 'environment': {'OPENSEARCH_DASHBOARDS_SECURITY_PLUGIN': 'False'}, 'image': 'registry.osism.tech/kolla/opensearch-dashboards:2024.2', 'volumes': ['/etc/kolla/opensearch-dashboards/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:5601'], 'timeout': '30'}, 'haproxy': {'opensearch-dashboards': {'enabled': True, 'mode': 'http', 'external': False, 'port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password'}, 'opensearch_dashboards_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '5601', 'listen_port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password'}}}}) 2025-09-27 00:53:32.229500 | orchestrator | changed: [testbed-node-1] => (item={'key': 'opensearch-dashboards', 'value': {'container_name': 'opensearch_dashboards', 'group': 'opensearch-dashboards', 'enabled': True, 'environment': {'OPENSEARCH_DASHBOARDS_SECURITY_PLUGIN': 'False'}, 'image': 'registry.osism.tech/kolla/opensearch-dashboards:2024.2', 'volumes': 
['/etc/kolla/opensearch-dashboards/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:5601'], 'timeout': '30'}, 'haproxy': {'opensearch-dashboards': {'enabled': True, 'mode': 'http', 'external': False, 'port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password'}, 'opensearch_dashboards_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '5601', 'listen_port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password'}}}}) 2025-09-27 00:53:32.229512 | orchestrator | 2025-09-27 00:53:32.229523 | orchestrator | TASK [opensearch : Copying over opensearch service config file] **************** 2025-09-27 00:53:32.229534 | orchestrator | Saturday 27 September 2025 00:49:46 +0000 (0:00:01.981) 0:00:11.185 **** 2025-09-27 00:53:32.229545 | orchestrator | changed: [testbed-node-1] 2025-09-27 00:53:32.229556 | orchestrator | changed: [testbed-node-2] 2025-09-27 00:53:32.229567 | orchestrator | changed: [testbed-node-0] 2025-09-27 00:53:32.229578 | orchestrator | 2025-09-27 00:53:32.229589 | orchestrator | TASK [opensearch : Copying over opensearch-dashboards config file] ************* 2025-09-27 00:53:32.229599 | orchestrator | Saturday 27 September 2025 00:49:49 +0000 (0:00:02.645) 0:00:13.830 **** 2025-09-27 00:53:32.229610 | orchestrator | changed: [testbed-node-0] 2025-09-27 00:53:32.229621 | orchestrator | changed: [testbed-node-2] 2025-09-27 00:53:32.229632 | orchestrator | changed: [testbed-node-1] 2025-09-27 00:53:32.229642 | orchestrator | 2025-09-27 00:53:32.229653 | orchestrator | TASK [opensearch : Check opensearch containers] ******************************** 2025-09-27 00:53:32.229664 | orchestrator | Saturday 27 September 2025 00:49:51 +0000 (0:00:02.122) 0:00:15.953 **** 2025-09-27 00:53:32.229681 | orchestrator | changed: [testbed-node-0] => (item={'key': 'opensearch', 'value': {'container_name': 'opensearch', 'group': 'opensearch', 'enabled': True, 'image': 'registry.osism.tech/kolla/opensearch:2024.2', 'environment': {'OPENSEARCH_JAVA_OPTS': '-Xms1g -Xmx1g -Dlog4j2.formatMsgNoLookups=true'}, 'volumes': ['/etc/kolla/opensearch/:/var/lib/kolla/config_files/', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'opensearch:/var/lib/opensearch/data', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:9200'], 'timeout': '30'}, 'haproxy': {'opensearch': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9200', 'frontend_http_extra': ['option dontlog-normal']}}}}) 2025-09-27 00:53:32.229817 | orchestrator | changed: [testbed-node-1] => (item={'key': 'opensearch', 'value': {'container_name': 'opensearch', 'group': 'opensearch', 'enabled': True, 'image': 'registry.osism.tech/kolla/opensearch:2024.2', 'environment': {'OPENSEARCH_JAVA_OPTS': '-Xms1g -Xmx1g -Dlog4j2.formatMsgNoLookups=true'}, 'volumes': ['/etc/kolla/opensearch/:/var/lib/kolla/config_files/', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'opensearch:/var/lib/opensearch/data', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl 
http://192.168.16.11:9200'], 'timeout': '30'}, 'haproxy': {'opensearch': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9200', 'frontend_http_extra': ['option dontlog-normal']}}}}) 2025-09-27 00:53:32.229832 | orchestrator | changed: [testbed-node-2] => (item={'key': 'opensearch', 'value': {'container_name': 'opensearch', 'group': 'opensearch', 'enabled': True, 'image': 'registry.osism.tech/kolla/opensearch:2024.2', 'environment': {'OPENSEARCH_JAVA_OPTS': '-Xms1g -Xmx1g -Dlog4j2.formatMsgNoLookups=true'}, 'volumes': ['/etc/kolla/opensearch/:/var/lib/kolla/config_files/', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'opensearch:/var/lib/opensearch/data', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:9200'], 'timeout': '30'}, 'haproxy': {'opensearch': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9200', 'frontend_http_extra': ['option dontlog-normal']}}}}) 2025-09-27 00:53:32.229844 | orchestrator | changed: [testbed-node-0] => (item={'key': 'opensearch-dashboards', 'value': {'container_name': 'opensearch_dashboards', 'group': 'opensearch-dashboards', 'enabled': True, 'environment': {'OPENSEARCH_DASHBOARDS_SECURITY_PLUGIN': 'False'}, 'image': 'registry.osism.tech/kolla/opensearch-dashboards:2024.2', 'volumes': ['/etc/kolla/opensearch-dashboards/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:5601'], 'timeout': '30'}, 'haproxy': {'opensearch-dashboards': {'enabled': True, 'mode': 'http', 'external': False, 'port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password'}, 'opensearch_dashboards_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '5601', 'listen_port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password'}}}}) 2025-09-27 00:53:32.229857 | orchestrator | changed: [testbed-node-1] => (item={'key': 'opensearch-dashboards', 'value': {'container_name': 'opensearch_dashboards', 'group': 'opensearch-dashboards', 'enabled': True, 'environment': {'OPENSEARCH_DASHBOARDS_SECURITY_PLUGIN': 'False'}, 'image': 'registry.osism.tech/kolla/opensearch-dashboards:2024.2', 'volumes': ['/etc/kolla/opensearch-dashboards/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:5601'], 'timeout': '30'}, 'haproxy': {'opensearch-dashboards': {'enabled': True, 'mode': 'http', 'external': False, 'port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password'}, 'opensearch_dashboards_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '5601', 'listen_port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password'}}}}) 2025-09-27 00:53:32.229883 | orchestrator | changed: [testbed-node-2] => (item={'key': 'opensearch-dashboards', 'value': {'container_name': 'opensearch_dashboards', 'group': 'opensearch-dashboards', 'enabled': True, 'environment': {'OPENSEARCH_DASHBOARDS_SECURITY_PLUGIN': 'False'}, 'image': 
'registry.osism.tech/kolla/opensearch-dashboards:2024.2', 'volumes': ['/etc/kolla/opensearch-dashboards/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:5601'], 'timeout': '30'}, 'haproxy': {'opensearch-dashboards': {'enabled': True, 'mode': 'http', 'external': False, 'port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password'}, 'opensearch_dashboards_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '5601', 'listen_port': '5601', 'auth_user': 'opensearch', 'auth_pass': 'password'}}}}) 2025-09-27 00:53:32.229895 | orchestrator | 2025-09-27 00:53:32.229911 | orchestrator | TASK [opensearch : include_tasks] ********************************************** 2025-09-27 00:53:32.229922 | orchestrator | Saturday 27 September 2025 00:49:53 +0000 (0:00:02.343) 0:00:18.296 **** 2025-09-27 00:53:32.229933 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:53:32.229944 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:53:32.229955 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:53:32.229965 | orchestrator | 2025-09-27 00:53:32.229976 | orchestrator | TASK [opensearch : Flush handlers] ********************************************* 2025-09-27 00:53:32.229987 | orchestrator | Saturday 27 September 2025 00:49:53 +0000 (0:00:00.332) 0:00:18.629 **** 2025-09-27 00:53:32.229998 | orchestrator | 2025-09-27 00:53:32.230009 | orchestrator | TASK [opensearch : Flush handlers] ********************************************* 2025-09-27 00:53:32.230066 | orchestrator | Saturday 27 September 2025 00:49:54 +0000 (0:00:00.107) 0:00:18.737 **** 2025-09-27 00:53:32.230079 | orchestrator | 2025-09-27 00:53:32.230090 | orchestrator | TASK [opensearch : Flush handlers] ********************************************* 2025-09-27 00:53:32.230133 | orchestrator | Saturday 27 September 2025 00:49:54 +0000 (0:00:00.068) 0:00:18.805 **** 2025-09-27 00:53:32.230144 | orchestrator | 2025-09-27 00:53:32.230155 | orchestrator | RUNNING HANDLER [opensearch : Disable shard allocation] ************************ 2025-09-27 00:53:32.230165 | orchestrator | Saturday 27 September 2025 00:49:54 +0000 (0:00:00.087) 0:00:18.892 **** 2025-09-27 00:53:32.230176 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:53:32.230187 | orchestrator | 2025-09-27 00:53:32.230198 | orchestrator | RUNNING HANDLER [opensearch : Perform a flush] ********************************* 2025-09-27 00:53:32.230209 | orchestrator | Saturday 27 September 2025 00:49:54 +0000 (0:00:00.193) 0:00:19.085 **** 2025-09-27 00:53:32.230219 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:53:32.230230 | orchestrator | 2025-09-27 00:53:32.230241 | orchestrator | RUNNING HANDLER [opensearch : Restart opensearch container] ******************** 2025-09-27 00:53:32.230252 | orchestrator | Saturday 27 September 2025 00:49:55 +0000 (0:00:00.650) 0:00:19.736 **** 2025-09-27 00:53:32.230263 | orchestrator | changed: [testbed-node-0] 2025-09-27 00:53:32.230273 | orchestrator | changed: [testbed-node-1] 2025-09-27 00:53:32.230284 | orchestrator | changed: [testbed-node-2] 2025-09-27 00:53:32.230302 | orchestrator | 2025-09-27 00:53:32.230313 | orchestrator | RUNNING HANDLER [opensearch : Restart opensearch-dashboards container] ********* 2025-09-27 
00:53:32.230324 | orchestrator | Saturday 27 September 2025 00:50:57 +0000 (0:01:02.529) 0:01:22.265 **** 2025-09-27 00:53:32.230335 | orchestrator | changed: [testbed-node-0] 2025-09-27 00:53:32.230345 | orchestrator | changed: [testbed-node-2] 2025-09-27 00:53:32.230356 | orchestrator | changed: [testbed-node-1] 2025-09-27 00:53:32.230367 | orchestrator | 2025-09-27 00:53:32.230377 | orchestrator | TASK [opensearch : include_tasks] ********************************************** 2025-09-27 00:53:32.230389 | orchestrator | Saturday 27 September 2025 00:52:15 +0000 (0:01:18.149) 0:02:40.415 **** 2025-09-27 00:53:32.230399 | orchestrator | included: /ansible/roles/opensearch/tasks/post-config.yml for testbed-node-0, testbed-node-1, testbed-node-2 2025-09-27 00:53:32.230410 | orchestrator | 2025-09-27 00:53:32.230421 | orchestrator | TASK [opensearch : Wait for OpenSearch to become ready] ************************ 2025-09-27 00:53:32.230432 | orchestrator | Saturday 27 September 2025 00:52:16 +0000 (0:00:00.521) 0:02:40.936 **** 2025-09-27 00:53:32.230444 | orchestrator | FAILED - RETRYING: [testbed-node-0]: Wait for OpenSearch to become ready (30 retries left). 2025-09-27 00:53:32.230458 | orchestrator | FAILED - RETRYING: [testbed-node-0]: Wait for OpenSearch to become ready (29 retries left). 2025-09-27 00:53:32.230471 | orchestrator | FAILED - RETRYING: [testbed-node-0]: Wait for OpenSearch to become ready (28 retries left). 2025-09-27 00:53:32.230483 | orchestrator | FAILED - RETRYING: [testbed-node-0]: Wait for OpenSearch to become ready (27 retries left). 2025-09-27 00:53:32.230496 | orchestrator | FAILED - RETRYING: [testbed-node-0]: Wait for OpenSearch to become ready (26 retries left). 2025-09-27 00:53:32.230508 | orchestrator | FAILED - RETRYING: [testbed-node-0]: Wait for OpenSearch to become ready (25 retries left). 2025-09-27 00:53:32.230520 | orchestrator | FAILED - RETRYING: [testbed-node-0]: Wait for OpenSearch to become ready (24 retries left). 2025-09-27 00:53:32.230532 | orchestrator | FAILED - RETRYING: [testbed-node-0]: Wait for OpenSearch to become ready (23 retries left). 2025-09-27 00:53:32.230559 | orchestrator | FAILED - RETRYING: [testbed-node-0]: Wait for OpenSearch to become ready (22 retries left). 2025-09-27 00:53:32.230581 | orchestrator | FAILED - RETRYING: [testbed-node-0]: Wait for OpenSearch to become ready (21 retries left). 2025-09-27 00:53:32.230593 | orchestrator | FAILED - RETRYING: [testbed-node-0]: Wait for OpenSearch to become ready (20 retries left). 2025-09-27 00:53:32.230606 | orchestrator | FAILED - RETRYING: [testbed-node-0]: Wait for OpenSearch to become ready (19 retries left). 2025-09-27 00:53:32.230625 | orchestrator | FAILED - RETRYING: [testbed-node-0]: Wait for OpenSearch to become ready (18 retries left). 2025-09-27 00:53:32.230638 | orchestrator | FAILED - RETRYING: [testbed-node-0]: Wait for OpenSearch to become ready (17 retries left). 2025-09-27 00:53:32.230650 | orchestrator | FAILED - RETRYING: [testbed-node-0]: Wait for OpenSearch to become ready (16 retries left). 2025-09-27 00:53:32.230663 | orchestrator | FAILED - RETRYING: [testbed-node-0]: Wait for OpenSearch to become ready (15 retries left). 2025-09-27 00:53:32.230675 | orchestrator | FAILED - RETRYING: [testbed-node-0]: Wait for OpenSearch to become ready (14 retries left). 2025-09-27 00:53:32.230687 | orchestrator | FAILED - RETRYING: [testbed-node-0]: Wait for OpenSearch to become ready (13 retries left). 
2025-09-27 00:53:32.230705 | orchestrator | FAILED - RETRYING: [testbed-node-0]: Wait for OpenSearch to become ready (12 retries left). 2025-09-27 00:53:32.230717 | orchestrator | FAILED - RETRYING: [testbed-node-0]: Wait for OpenSearch to become ready (11 retries left). 2025-09-27 00:53:32.230730 | orchestrator | FAILED - RETRYING: [testbed-node-0]: Wait for OpenSearch to become ready (10 retries left). 2025-09-27 00:53:32.230742 | orchestrator | FAILED - RETRYING: [testbed-node-0]: Wait for OpenSearch to become ready (9 retries left). 2025-09-27 00:53:32.230755 | orchestrator | FAILED - RETRYING: [testbed-node-0]: Wait for OpenSearch to become ready (8 retries left). 2025-09-27 00:53:32.230774 | orchestrator | FAILED - RETRYING: [testbed-node-0]: Wait for OpenSearch to become ready (7 retries left). 2025-09-27 00:53:32.230787 | orchestrator | FAILED - RETRYING: [testbed-node-0]: Wait for OpenSearch to become ready (6 retries left). 2025-09-27 00:53:32.230799 | orchestrator | FAILED - RETRYING: [testbed-node-0]: Wait for OpenSearch to become ready (5 retries left). 2025-09-27 00:53:32.230810 | orchestrator | FAILED - RETRYING: [testbed-node-0]: Wait for OpenSearch to become ready (4 retries left). 2025-09-27 00:53:32.230820 | orchestrator | FAILED - RETRYING: [testbed-node-0]: Wait for OpenSearch to become ready (3 retries left). 2025-09-27 00:53:32.230831 | orchestrator | FAILED - RETRYING: [testbed-node-0]: Wait for OpenSearch to become ready (2 retries left). 2025-09-27 00:53:32.230842 | orchestrator | FAILED - RETRYING: [testbed-node-0]: Wait for OpenSearch to become ready (1 retries left). 2025-09-27 00:53:32.230853 | orchestrator | fatal: [testbed-node-0]: FAILED! => {"attempts": 30, "changed": false, "msg": "kolla_toolbox container is not running."} 2025-09-27 00:53:32.230865 | orchestrator | 2025-09-27 00:53:32.230876 | orchestrator | PLAY RECAP ********************************************************************* 2025-09-27 00:53:32.230887 | orchestrator | testbed-node-0 : ok=14  changed=9  unreachable=0 failed=1  skipped=5  rescued=0 ignored=0 2025-09-27 00:53:32.230900 | orchestrator | testbed-node-1 : ok=14  changed=9  unreachable=0 failed=0 skipped=3  rescued=0 ignored=0 2025-09-27 00:53:32.230911 | orchestrator | testbed-node-2 : ok=14  changed=9  unreachable=0 failed=0 skipped=3  rescued=0 ignored=0 2025-09-27 00:53:32.230922 | orchestrator | 2025-09-27 00:53:32.230933 | orchestrator | 2025-09-27 00:53:32.230944 | orchestrator | TASKS RECAP ******************************************************************** 2025-09-27 00:53:32.230955 | orchestrator | Saturday 27 September 2025 00:53:31 +0000 (0:01:15.580) 0:03:56.516 **** 2025-09-27 00:53:32.230966 | orchestrator | =============================================================================== 2025-09-27 00:53:32.230976 | orchestrator | opensearch : Restart opensearch-dashboards container ------------------- 78.15s 2025-09-27 00:53:32.230987 | orchestrator | opensearch : Wait for OpenSearch to become ready ----------------------- 75.58s 2025-09-27 00:53:32.230998 | orchestrator | opensearch : Restart opensearch container ------------------------------ 62.53s 2025-09-27 00:53:32.231009 | orchestrator | opensearch : Copying over opensearch service config file ---------------- 2.65s 2025-09-27 00:53:32.231019 | orchestrator | service-cert-copy : opensearch | Copying over extra CA certificates ----- 2.42s 2025-09-27 00:53:32.231030 | orchestrator | opensearch : Check opensearch containers -------------------------------- 2.34s 
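The "Wait for OpenSearch to become ready" task above exhausts its 30 retries and then aborts because the check is executed through the kolla_toolbox container, which is reported as not running on testbed-node-0. Below is a minimal, standalone Python sketch of the same bounded readiness-poll pattern; the endpoint matches the internal healthcheck URL printed earlier in the service definition, while the 10-second delay is an assumption and this is not kolla-ansible's actual implementation.

# Hedged, standalone analogue of the "Wait for OpenSearch to become ready" retry
# pattern: poll an HTTP endpoint until it answers, giving up after a fixed number
# of attempts. In the real deployment the probe runs inside kolla_toolbox, which
# is why the task fails outright here when that container is absent.
import time
import urllib.error
import urllib.request

def wait_until_ready(url: str, retries: int = 30, delay: float = 10.0) -> bool:
    """Poll url until it answers with HTTP 200 or the retry budget is exhausted."""
    for attempt in range(1, retries + 1):
        try:
            with urllib.request.urlopen(url, timeout=5) as response:
                if response.status == 200:
                    return True
        except (urllib.error.URLError, OSError):
            pass  # endpoint not answering yet
        remaining = retries - attempt
        if remaining:
            print(f"FAILED - RETRYING: Wait for OpenSearch to become ready ({remaining} retries left).")
            time.sleep(delay)
    return False

if __name__ == "__main__":
    # 192.168.16.10:9200 is the internal endpoint from the healthcheck shown above;
    # the delay value is an assumption for illustration.
    print("ready" if wait_until_ready("http://192.168.16.10:9200") else "gave up after 30 attempts")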
2025-09-27 00:53:32.231041 | orchestrator | opensearch : Copying over opensearch-dashboards config file ------------- 2.12s 2025-09-27 00:53:32.231052 | orchestrator | opensearch : Copying over config.json files for services ---------------- 1.98s 2025-09-27 00:53:32.231062 | orchestrator | opensearch : Ensuring config directories exist -------------------------- 1.50s 2025-09-27 00:53:32.231073 | orchestrator | service-cert-copy : opensearch | Copying over backend internal TLS key --- 1.44s 2025-09-27 00:53:32.231084 | orchestrator | service-cert-copy : opensearch | Copying over backend internal TLS certificate --- 1.16s 2025-09-27 00:53:32.231155 | orchestrator | opensearch : Perform a flush -------------------------------------------- 0.65s 2025-09-27 00:53:32.231169 | orchestrator | opensearch : include_tasks ---------------------------------------------- 0.64s 2025-09-27 00:53:32.231180 | orchestrator | opensearch : Setting sysctl values -------------------------------------- 0.63s 2025-09-27 00:53:32.231190 | orchestrator | opensearch : include_tasks ---------------------------------------------- 0.52s 2025-09-27 00:53:32.231201 | orchestrator | opensearch : include_tasks ---------------------------------------------- 0.47s 2025-09-27 00:53:32.231217 | orchestrator | Group hosts based on enabled services ----------------------------------- 0.39s 2025-09-27 00:53:32.231235 | orchestrator | opensearch : include_tasks ---------------------------------------------- 0.33s 2025-09-27 00:53:32.231246 | orchestrator | Group hosts based on Kolla action --------------------------------------- 0.27s 2025-09-27 00:53:32.231257 | orchestrator | opensearch : Flush handlers --------------------------------------------- 0.26s 2025-09-27 00:53:35.271569 | orchestrator | 2025-09-27 00:53:35 | INFO  | Task 625d002c-1b01-4e2f-8d5c-d5ab32c60c73 is in state STARTED 2025-09-27 00:53:35.273753 | orchestrator | 2025-09-27 00:53:35 | INFO  | Task 4fa01eb7-5be3-437c-b4e5-8b90d9b0cd67 is in state STARTED 2025-09-27 00:53:35.274737 | orchestrator | 2025-09-27 00:53:35 | INFO  | Wait 1 second(s) until the next check 2025-09-27 00:53:38.319967 | orchestrator | 2025-09-27 00:53:38 | INFO  | Task 625d002c-1b01-4e2f-8d5c-d5ab32c60c73 is in state STARTED 2025-09-27 00:53:38.320322 | orchestrator | 2025-09-27 00:53:38 | INFO  | Task 4fa01eb7-5be3-437c-b4e5-8b90d9b0cd67 is in state STARTED 2025-09-27 00:53:38.320352 | orchestrator | 2025-09-27 00:53:38 | INFO  | Wait 1 second(s) until the next check 2025-09-27 00:53:41.361302 | orchestrator | 2025-09-27 00:53:41 | INFO  | Task 625d002c-1b01-4e2f-8d5c-d5ab32c60c73 is in state STARTED 2025-09-27 00:53:41.363266 | orchestrator | 2025-09-27 00:53:41 | INFO  | Task 4fa01eb7-5be3-437c-b4e5-8b90d9b0cd67 is in state STARTED 2025-09-27 00:53:41.363300 | orchestrator | 2025-09-27 00:53:41 | INFO  | Wait 1 second(s) until the next check 2025-09-27 00:53:44.406870 | orchestrator | 2025-09-27 00:53:44 | INFO  | Task 625d002c-1b01-4e2f-8d5c-d5ab32c60c73 is in state STARTED 2025-09-27 00:53:44.407905 | orchestrator | 2025-09-27 00:53:44 | INFO  | Task 4fa01eb7-5be3-437c-b4e5-8b90d9b0cd67 is in state STARTED 2025-09-27 00:53:44.407937 | orchestrator | 2025-09-27 00:53:44 | INFO  | Wait 1 second(s) until the next check 2025-09-27 00:53:47.460504 | orchestrator | 2025-09-27 00:53:47 | INFO  | Task 625d002c-1b01-4e2f-8d5c-d5ab32c60c73 is in state STARTED 2025-09-27 00:53:47.461794 | orchestrator | 2025-09-27 00:53:47 | INFO  | Task 4fa01eb7-5be3-437c-b4e5-8b90d9b0cd67 is in state STARTED 
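The INFO lines in this stretch of the log come from the deployment tooling polling its task backend once per second until each task reaches a terminal state. A minimal Python sketch of such a loop follows; get_task_state() is a hypothetical placeholder for however the real client queries task status, not an actual OSISM API.

# Sketch of the state-polling loop reflected in the INFO lines: check each task's
# state, wait one second, and stop once every task has reached a terminal state.
import time
from typing import Callable, Iterable

TERMINAL_STATES = {"SUCCESS", "FAILURE"}

def wait_for_tasks(task_ids: Iterable[str],
                   get_task_state: Callable[[str], str],
                   interval: float = 1.0) -> dict[str, str]:
    states = {task_id: "PENDING" for task_id in task_ids}
    while True:
        for task_id in states:
            states[task_id] = get_task_state(task_id)
            print(f"INFO | Task {task_id} is in state {states[task_id]}")
        if all(state in TERMINAL_STATES for state in states.values()):
            return states
        print(f"INFO | Wait {int(interval)} second(s) until the next check")
        time.sleep(interval)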
2025-09-27 00:53:47.461967 | orchestrator | 2025-09-27 00:53:47 | INFO  | Wait 1 second(s) until the next check 2025-09-27 00:53:50.511521 | orchestrator | 2025-09-27 00:53:50 | INFO  | Task 625d002c-1b01-4e2f-8d5c-d5ab32c60c73 is in state STARTED 2025-09-27 00:53:50.512120 | orchestrator | 2025-09-27 00:53:50 | INFO  | Task 4fa01eb7-5be3-437c-b4e5-8b90d9b0cd67 is in state STARTED 2025-09-27 00:53:50.512201 | orchestrator | 2025-09-27 00:53:50 | INFO  | Wait 1 second(s) until the next check 2025-09-27 00:53:53.563406 | orchestrator | 2025-09-27 00:53:53 | INFO  | Task 625d002c-1b01-4e2f-8d5c-d5ab32c60c73 is in state STARTED 2025-09-27 00:53:53.564821 | orchestrator | 2025-09-27 00:53:53 | INFO  | Task 4fa01eb7-5be3-437c-b4e5-8b90d9b0cd67 is in state STARTED 2025-09-27 00:53:53.565350 | orchestrator | 2025-09-27 00:53:53 | INFO  | Wait 1 second(s) until the next check 2025-09-27 00:53:56.614959 | orchestrator | 2025-09-27 00:53:56 | INFO  | Task 625d002c-1b01-4e2f-8d5c-d5ab32c60c73 is in state SUCCESS 2025-09-27 00:53:56.616348 | orchestrator | 2025-09-27 00:53:56.616391 | orchestrator | 2025-09-27 00:53:56.616404 | orchestrator | PLAY [Prepare deployment of Ceph services] ************************************* 2025-09-27 00:53:56.616416 | orchestrator | 2025-09-27 00:53:56.616427 | orchestrator | TASK [ceph-facts : Include facts.yml] ****************************************** 2025-09-27 00:53:56.616545 | orchestrator | Saturday 27 September 2025 00:43:33 +0000 (0:00:00.838) 0:00:00.838 **** 2025-09-27 00:53:56.616559 | orchestrator | included: /ansible/roles/ceph-facts/tasks/facts.yml for testbed-node-0, testbed-node-1, testbed-node-2, testbed-node-3, testbed-node-4, testbed-node-5 2025-09-27 00:53:56.616597 | orchestrator | 2025-09-27 00:53:56.616610 | orchestrator | TASK [ceph-facts : Check if it is atomic host] ********************************* 2025-09-27 00:53:56.616649 | orchestrator | Saturday 27 September 2025 00:43:34 +0000 (0:00:01.149) 0:00:01.987 **** 2025-09-27 00:53:56.616661 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:53:56.616722 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:53:56.616736 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:53:56.616747 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:53:56.616758 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:53:56.616768 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:53:56.616779 | orchestrator | 2025-09-27 00:53:56.616790 | orchestrator | TASK [ceph-facts : Set_fact is_atomic] ***************************************** 2025-09-27 00:53:56.616890 | orchestrator | Saturday 27 September 2025 00:43:35 +0000 (0:00:01.542) 0:00:03.530 **** 2025-09-27 00:53:56.616902 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:53:56.616913 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:53:56.616923 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:53:56.616934 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:53:56.616946 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:53:56.616958 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:53:56.616970 | orchestrator | 2025-09-27 00:53:56.616982 | orchestrator | TASK [ceph-facts : Check if podman binary is present] ************************** 2025-09-27 00:53:56.616995 | orchestrator | Saturday 27 September 2025 00:43:36 +0000 (0:00:00.679) 0:00:04.209 **** 2025-09-27 00:53:56.617007 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:53:56.617020 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:53:56.617058 | orchestrator | ok: [testbed-node-2] 2025-09-27 
00:53:56.617071 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:53:56.617084 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:53:56.617115 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:53:56.617126 | orchestrator | 2025-09-27 00:53:56.617137 | orchestrator | TASK [ceph-facts : Set_fact container_binary] ********************************** 2025-09-27 00:53:56.617148 | orchestrator | Saturday 27 September 2025 00:43:37 +0000 (0:00:01.138) 0:00:05.348 **** 2025-09-27 00:53:56.617159 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:53:56.617169 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:53:56.617180 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:53:56.617191 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:53:56.617246 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:53:56.617256 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:53:56.617267 | orchestrator | 2025-09-27 00:53:56.617301 | orchestrator | TASK [ceph-facts : Set_fact ceph_cmd] ****************************************** 2025-09-27 00:53:56.617313 | orchestrator | Saturday 27 September 2025 00:43:39 +0000 (0:00:01.317) 0:00:06.666 **** 2025-09-27 00:53:56.617340 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:53:56.617351 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:53:56.617362 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:53:56.617562 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:53:56.617575 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:53:56.617586 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:53:56.617596 | orchestrator | 2025-09-27 00:53:56.617608 | orchestrator | TASK [ceph-facts : Set_fact discovered_interpreter_python] ********************* 2025-09-27 00:53:56.617691 | orchestrator | Saturday 27 September 2025 00:43:39 +0000 (0:00:00.560) 0:00:07.226 **** 2025-09-27 00:53:56.617704 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:53:56.617714 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:53:56.617725 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:53:56.617736 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:53:56.617747 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:53:56.617758 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:53:56.617768 | orchestrator | 2025-09-27 00:53:56.617779 | orchestrator | TASK [ceph-facts : Set_fact discovered_interpreter_python if not previously set] *** 2025-09-27 00:53:56.617791 | orchestrator | Saturday 27 September 2025 00:43:40 +0000 (0:00:01.013) 0:00:08.239 **** 2025-09-27 00:53:56.617802 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:53:56.617814 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:53:56.617825 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:53:56.617848 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:53:56.617859 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:53:56.617870 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:53:56.617923 | orchestrator | 2025-09-27 00:53:56.617934 | orchestrator | TASK [ceph-facts : Set_fact ceph_release ceph_stable_release] ****************** 2025-09-27 00:53:56.617945 | orchestrator | Saturday 27 September 2025 00:43:41 +0000 (0:00:01.046) 0:00:09.285 **** 2025-09-27 00:53:56.617956 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:53:56.617967 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:53:56.617978 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:53:56.617989 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:53:56.617999 | orchestrator | ok: [testbed-node-4] 2025-09-27 
00:53:56.618010 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:53:56.618076 | orchestrator | 2025-09-27 00:53:56.618120 | orchestrator | TASK [ceph-facts : Set_fact monitor_name ansible_facts['hostname']] ************ 2025-09-27 00:53:56.618157 | orchestrator | Saturday 27 September 2025 00:43:42 +0000 (0:00:00.709) 0:00:09.995 **** 2025-09-27 00:53:56.618170 | orchestrator | ok: [testbed-node-0] => (item=testbed-node-0) 2025-09-27 00:53:56.618181 | orchestrator | ok: [testbed-node-0 -> testbed-node-1(192.168.16.11)] => (item=testbed-node-1) 2025-09-27 00:53:56.618192 | orchestrator | ok: [testbed-node-0 -> testbed-node-2(192.168.16.12)] => (item=testbed-node-2) 2025-09-27 00:53:56.618406 | orchestrator | 2025-09-27 00:53:56.618418 | orchestrator | TASK [ceph-facts : Set_fact container_exec_cmd] ******************************** 2025-09-27 00:53:56.618429 | orchestrator | Saturday 27 September 2025 00:43:43 +0000 (0:00:00.662) 0:00:10.658 **** 2025-09-27 00:53:56.618472 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:53:56.618483 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:53:56.618494 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:53:56.618505 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:53:56.618516 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:53:56.618527 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:53:56.618538 | orchestrator | 2025-09-27 00:53:56.618565 | orchestrator | TASK [ceph-facts : Find a running mon container] ******************************* 2025-09-27 00:53:56.618577 | orchestrator | Saturday 27 September 2025 00:43:44 +0000 (0:00:01.719) 0:00:12.377 **** 2025-09-27 00:53:56.618588 | orchestrator | ok: [testbed-node-0] => (item=testbed-node-0) 2025-09-27 00:53:56.618599 | orchestrator | ok: [testbed-node-0 -> testbed-node-1(192.168.16.11)] => (item=testbed-node-1) 2025-09-27 00:53:56.618622 | orchestrator | ok: [testbed-node-0 -> testbed-node-2(192.168.16.12)] => (item=testbed-node-2) 2025-09-27 00:53:56.618634 | orchestrator | 2025-09-27 00:53:56.618645 | orchestrator | TASK [ceph-facts : Check for a ceph mon socket] ******************************** 2025-09-27 00:53:56.618656 | orchestrator | Saturday 27 September 2025 00:43:48 +0000 (0:00:03.356) 0:00:15.734 **** 2025-09-27 00:53:56.618666 | orchestrator | skipping: [testbed-node-0] => (item=testbed-node-0)  2025-09-27 00:53:56.618678 | orchestrator | skipping: [testbed-node-0] => (item=testbed-node-1)  2025-09-27 00:53:56.618688 | orchestrator | skipping: [testbed-node-0] => (item=testbed-node-2)  2025-09-27 00:53:56.618699 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:53:56.618766 | orchestrator | 2025-09-27 00:53:56.618779 | orchestrator | TASK [ceph-facts : Check if the ceph mon socket is in-use] ********************* 2025-09-27 00:53:56.618789 | orchestrator | Saturday 27 September 2025 00:43:48 +0000 (0:00:00.844) 0:00:16.578 **** 2025-09-27 00:53:56.618843 | orchestrator | skipping: [testbed-node-0] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'not containerized_deployment | bool', 'item': 'testbed-node-0', 'ansible_loop_var': 'item'})  2025-09-27 00:53:56.618858 | orchestrator | skipping: [testbed-node-0] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'not containerized_deployment | bool', 'item': 'testbed-node-1', 'ansible_loop_var': 'item'})  2025-09-27 00:53:56.618869 | orchestrator | skipping: [testbed-node-0] => (item={'changed': 
False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'not containerized_deployment | bool', 'item': 'testbed-node-2', 'ansible_loop_var': 'item'})  2025-09-27 00:53:56.618892 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:53:56.618903 | orchestrator | 2025-09-27 00:53:56.618914 | orchestrator | TASK [ceph-facts : Set_fact running_mon - non_container] *********************** 2025-09-27 00:53:56.618925 | orchestrator | Saturday 27 September 2025 00:43:50 +0000 (0:00:01.150) 0:00:17.728 **** 2025-09-27 00:53:56.618946 | orchestrator | skipping: [testbed-node-0] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'not containerized_deployment | bool', 'item': {'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'not containerized_deployment | bool', 'item': 'testbed-node-0', 'ansible_loop_var': 'item'}, 'ansible_loop_var': 'item'})  2025-09-27 00:53:56.618960 | orchestrator | skipping: [testbed-node-0] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'not containerized_deployment | bool', 'item': {'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'not containerized_deployment | bool', 'item': 'testbed-node-1', 'ansible_loop_var': 'item'}, 'ansible_loop_var': 'item'})  2025-09-27 00:53:56.618972 | orchestrator | skipping: [testbed-node-0] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'not containerized_deployment | bool', 'item': {'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'not containerized_deployment | bool', 'item': 'testbed-node-2', 'ansible_loop_var': 'item'}, 'ansible_loop_var': 'item'})  2025-09-27 00:53:56.618983 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:53:56.618994 | orchestrator | 2025-09-27 00:53:56.619005 | orchestrator | TASK [ceph-facts : Set_fact running_mon - container] *************************** 2025-09-27 00:53:56.619016 | orchestrator | Saturday 27 September 2025 00:43:50 +0000 (0:00:00.138) 0:00:17.867 **** 2025-09-27 00:53:56.619028 | orchestrator | skipping: [testbed-node-0] => (item={'changed': False, 'stdout': '', 'stderr': '', 'rc': 0, 'cmd': ['docker', 'ps', '-q', '--filter', 'name=ceph-mon-testbed-node-0'], 'start': '2025-09-27 00:43:45.565553', 'end': '2025-09-27 00:43:45.886746', 'delta': '0:00:00.321193', 'msg': '', 'invocation': {'module_args': {'_raw_params': 'docker ps -q --filter name=ceph-mon-testbed-node-0', '_uses_shell': False, 'expand_argument_vars': True, 'stdin_add_newline': True, 'strip_empty_ends': True, 'argv': None, 'chdir': None, 'executable': None, 'creates': None, 'removes': None, 'stdin': None}}, 'stdout_lines': [], 'stderr_lines': [], 'failed': False, 'failed_when_result': False, 'item': 'testbed-node-0', 'ansible_loop_var': 'item'})  2025-09-27 00:53:56.619052 | orchestrator | skipping: [testbed-node-0] => (item={'changed': False, 'stdout': '', 'stderr': '', 'rc': 0, 'cmd': ['docker', 'ps', '-q', '--filter', 'name=ceph-mon-testbed-node-1'], 'start': '2025-09-27 00:43:46.362180', 'end': '2025-09-27 00:43:46.666645', 'delta': '0:00:00.304465', 'msg': '', 'invocation': {'module_args': {'_raw_params': 'docker ps -q --filter name=ceph-mon-testbed-node-1', '_uses_shell': False, 'expand_argument_vars': True, 'stdin_add_newline': True, 
'strip_empty_ends': True, 'argv': None, 'chdir': None, 'executable': None, 'creates': None, 'removes': None, 'stdin': None}}, 'stdout_lines': [], 'stderr_lines': [], 'failed': False, 'failed_when_result': False, 'item': 'testbed-node-1', 'ansible_loop_var': 'item'})  2025-09-27 00:53:56.619064 | orchestrator | skipping: [testbed-node-0] => (item={'changed': False, 'stdout': '', 'stderr': '', 'rc': 0, 'cmd': ['docker', 'ps', '-q', '--filter', 'name=ceph-mon-testbed-node-2'], 'start': '2025-09-27 00:43:47.554729', 'end': '2025-09-27 00:43:47.866439', 'delta': '0:00:00.311710', 'msg': '', 'invocation': {'module_args': {'_raw_params': 'docker ps -q --filter name=ceph-mon-testbed-node-2', '_uses_shell': False, 'expand_argument_vars': True, 'stdin_add_newline': True, 'strip_empty_ends': True, 'argv': None, 'chdir': None, 'executable': None, 'creates': None, 'removes': None, 'stdin': None}}, 'stdout_lines': [], 'stderr_lines': [], 'failed': False, 'failed_when_result': False, 'item': 'testbed-node-2', 'ansible_loop_var': 'item'})  2025-09-27 00:53:56.619082 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:53:56.619153 | orchestrator | 2025-09-27 00:53:56.619164 | orchestrator | TASK [ceph-facts : Set_fact _container_exec_cmd] ******************************* 2025-09-27 00:53:56.619176 | orchestrator | Saturday 27 September 2025 00:43:50 +0000 (0:00:00.270) 0:00:18.138 **** 2025-09-27 00:53:56.619186 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:53:56.619197 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:53:56.619208 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:53:56.619219 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:53:56.619230 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:53:56.619240 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:53:56.619251 | orchestrator | 2025-09-27 00:53:56.619262 | orchestrator | TASK [ceph-facts : Get current fsid if cluster is already running] ************* 2025-09-27 00:53:56.619273 | orchestrator | Saturday 27 September 2025 00:43:52 +0000 (0:00:02.459) 0:00:20.597 **** 2025-09-27 00:53:56.619290 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:53:56.619301 | orchestrator | 2025-09-27 00:53:56.619311 | orchestrator | TASK [ceph-facts : Set_fact current_fsid rc 1] ********************************* 2025-09-27 00:53:56.619322 | orchestrator | Saturday 27 September 2025 00:43:53 +0000 (0:00:00.741) 0:00:21.339 **** 2025-09-27 00:53:56.619333 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:53:56.619344 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:53:56.619355 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:53:56.619365 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:53:56.619376 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:53:56.619386 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:53:56.619397 | orchestrator | 2025-09-27 00:53:56.619408 | orchestrator | TASK [ceph-facts : Get current fsid] ******************************************* 2025-09-27 00:53:56.619419 | orchestrator | Saturday 27 September 2025 00:43:54 +0000 (0:00:01.012) 0:00:22.351 **** 2025-09-27 00:53:56.619429 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:53:56.619440 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:53:56.619450 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:53:56.619461 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:53:56.619472 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:53:56.619482 | orchestrator | skipping: 
[testbed-node-5] 2025-09-27 00:53:56.619493 | orchestrator | 2025-09-27 00:53:56.619504 | orchestrator | TASK [ceph-facts : Set_fact fsid] ********************************************** 2025-09-27 00:53:56.619515 | orchestrator | Saturday 27 September 2025 00:43:55 +0000 (0:00:01.122) 0:00:23.474 **** 2025-09-27 00:53:56.619525 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:53:56.619536 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:53:56.619547 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:53:56.619557 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:53:56.619568 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:53:56.619579 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:53:56.619590 | orchestrator | 2025-09-27 00:53:56.619600 | orchestrator | TASK [ceph-facts : Set_fact fsid from current_fsid] **************************** 2025-09-27 00:53:56.619611 | orchestrator | Saturday 27 September 2025 00:43:56 +0000 (0:00:00.764) 0:00:24.239 **** 2025-09-27 00:53:56.619622 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:53:56.619632 | orchestrator | 2025-09-27 00:53:56.619643 | orchestrator | TASK [ceph-facts : Generate cluster fsid] ************************************** 2025-09-27 00:53:56.619654 | orchestrator | Saturday 27 September 2025 00:43:56 +0000 (0:00:00.089) 0:00:24.328 **** 2025-09-27 00:53:56.619665 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:53:56.619675 | orchestrator | 2025-09-27 00:53:56.619705 | orchestrator | TASK [ceph-facts : Set_fact fsid] ********************************************** 2025-09-27 00:53:56.619715 | orchestrator | Saturday 27 September 2025 00:43:56 +0000 (0:00:00.179) 0:00:24.507 **** 2025-09-27 00:53:56.619731 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:53:56.619741 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:53:56.619751 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:53:56.619760 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:53:56.619770 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:53:56.619779 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:53:56.619789 | orchestrator | 2025-09-27 00:53:56.619798 | orchestrator | TASK [ceph-facts : Resolve device link(s)] ************************************* 2025-09-27 00:53:56.619828 | orchestrator | Saturday 27 September 2025 00:43:57 +0000 (0:00:00.489) 0:00:24.997 **** 2025-09-27 00:53:56.619839 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:53:56.619848 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:53:56.619857 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:53:56.619867 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:53:56.619876 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:53:56.619885 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:53:56.619895 | orchestrator | 2025-09-27 00:53:56.619904 | orchestrator | TASK [ceph-facts : Set_fact build devices from resolved symlinks] ************** 2025-09-27 00:53:56.619914 | orchestrator | Saturday 27 September 2025 00:43:58 +0000 (0:00:00.909) 0:00:25.906 **** 2025-09-27 00:53:56.619923 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:53:56.619933 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:53:56.619942 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:53:56.619951 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:53:56.619961 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:53:56.619970 | orchestrator | 
skipping: [testbed-node-5] 2025-09-27 00:53:56.619979 | orchestrator | 2025-09-27 00:53:56.619989 | orchestrator | TASK [ceph-facts : Resolve dedicated_device link(s)] *************************** 2025-09-27 00:53:56.619998 | orchestrator | Saturday 27 September 2025 00:43:58 +0000 (0:00:00.599) 0:00:26.506 **** 2025-09-27 00:53:56.620008 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:53:56.620017 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:53:56.620027 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:53:56.620036 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:53:56.620045 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:53:56.620054 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:53:56.620064 | orchestrator | 2025-09-27 00:53:56.620073 | orchestrator | TASK [ceph-facts : Set_fact build dedicated_devices from resolved symlinks] **** 2025-09-27 00:53:56.620083 | orchestrator | Saturday 27 September 2025 00:44:00 +0000 (0:00:01.142) 0:00:27.648 **** 2025-09-27 00:53:56.620106 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:53:56.620116 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:53:56.620125 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:53:56.620135 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:53:56.620144 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:53:56.620153 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:53:56.620163 | orchestrator | 2025-09-27 00:53:56.620172 | orchestrator | TASK [ceph-facts : Resolve bluestore_wal_device link(s)] *********************** 2025-09-27 00:53:56.620182 | orchestrator | Saturday 27 September 2025 00:44:00 +0000 (0:00:00.731) 0:00:28.380 **** 2025-09-27 00:53:56.620191 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:53:56.620201 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:53:56.620210 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:53:56.620224 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:53:56.620234 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:53:56.620243 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:53:56.620252 | orchestrator | 2025-09-27 00:53:56.620262 | orchestrator | TASK [ceph-facts : Set_fact build bluestore_wal_devices from resolved symlinks] *** 2025-09-27 00:53:56.620272 | orchestrator | Saturday 27 September 2025 00:44:01 +0000 (0:00:00.982) 0:00:29.362 **** 2025-09-27 00:53:56.620286 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:53:56.620296 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:53:56.620404 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:53:56.620416 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:53:56.620426 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:53:56.620436 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:53:56.620445 | orchestrator | 2025-09-27 00:53:56.620455 | orchestrator | TASK [ceph-facts : Collect existed devices] ************************************ 2025-09-27 00:53:56.620465 | orchestrator | Saturday 27 September 2025 00:44:02 +0000 (0:00:00.677) 0:00:30.040 **** 2025-09-27 00:53:56.620475 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'loop0', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 
Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2025-09-27 00:53:56.620486 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'loop1', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2025-09-27 00:53:56.620497 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'loop2', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2025-09-27 00:53:56.620507 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'loop3', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2025-09-27 00:53:56.620525 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'loop4', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2025-09-27 00:53:56.620536 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'loop5', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2025-09-27 00:53:56.620546 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'loop6', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2025-09-27 00:53:56.620556 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'loop7', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2025-09-27 00:53:56.620605 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'sda', 'value': {'holders': [], 'host': 'SCSI storage controller: Red Hat, Inc. 
Virtio SCSI', 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_0d8bc224-c074-480c-a812-f41f761871f6', 'scsi-SQEMU_QEMU_HARDDISK_0d8bc224-c074-480c-a812-f41f761871f6'], 'labels': [], 'masters': [], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {'sda1': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_0d8bc224-c074-480c-a812-f41f761871f6-part1', 'scsi-SQEMU_QEMU_HARDDISK_0d8bc224-c074-480c-a812-f41f761871f6-part1'], 'labels': ['cloudimg-rootfs'], 'masters': [], 'uuids': ['b852d8d2-8460-44aa-8998-23e4f04d73cf']}, 'sectors': 165672927, 'sectorsize': 512, 'size': '79.00 GB', 'start': '2099200', 'uuid': 'b852d8d2-8460-44aa-8998-23e4f04d73cf'}, 'sda14': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_0d8bc224-c074-480c-a812-f41f761871f6-part14', 'scsi-SQEMU_QEMU_HARDDISK_0d8bc224-c074-480c-a812-f41f761871f6-part14'], 'labels': [], 'masters': [], 'uuids': []}, 'sectors': 8192, 'sectorsize': 512, 'size': '4.00 MB', 'start': '2048', 'uuid': None}, 'sda15': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_0d8bc224-c074-480c-a812-f41f761871f6-part15', 'scsi-SQEMU_QEMU_HARDDISK_0d8bc224-c074-480c-a812-f41f761871f6-part15'], 'labels': ['UEFI'], 'masters': [], 'uuids': ['5C78-612A']}, 'sectors': 217088, 'sectorsize': 512, 'size': '106.00 MB', 'start': '10240', 'uuid': '5C78-612A'}, 'sda16': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_0d8bc224-c074-480c-a812-f41f761871f6-part16', 'scsi-SQEMU_QEMU_HARDDISK_0d8bc224-c074-480c-a812-f41f761871f6-part16'], 'labels': ['BOOT'], 'masters': [], 'uuids': ['09d53dc1-1e03-4286-bbb8-2b1796cf92ec']}, 'sectors': 1869825, 'sectorsize': 512, 'size': '913.00 MB', 'start': '227328', 'uuid': '09d53dc1-1e03-4286-bbb8-2b1796cf92ec'}}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 167772160, 'sectorsize': '512', 'size': '80.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}})  2025-09-27 00:53:56.620627 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'sr0', 'value': {'holders': [], 'host': 'IDE interface: Intel Corporation 82371SB PIIX3 IDE [Natoma/Triton II]', 'links': {'ids': ['ata-QEMU_DVD-ROM_QM00001'], 'labels': ['config-2'], 'masters': [], 'uuids': ['2025-09-27-00-02-10-00']}, 'model': 'QEMU DVD-ROM', 'partitions': {}, 'removable': '1', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'mq-deadline', 'sectors': 253, 'sectorsize': '2048', 'size': '506.00 KB', 'support_discard': '0', 'vendor': 'QEMU', 'virtual': 1}})  2025-09-27 00:53:56.620638 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:53:56.620648 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'loop0', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2025-09-27 00:53:56.620658 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'loop1', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  
2025-09-27 00:53:56.620668 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'loop2', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2025-09-27 00:53:56.620793 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'loop3', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2025-09-27 00:53:56.620817 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'loop4', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2025-09-27 00:53:56.620827 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'loop5', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2025-09-27 00:53:56.620837 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'loop6', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2025-09-27 00:53:56.620847 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'loop7', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2025-09-27 00:53:56.620866 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'sda', 'value': {'holders': [], 'host': 'SCSI storage controller: Red Hat, Inc. 
Virtio SCSI', 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_a9409f28-5c60-4825-964c-57b4bb65617a', 'scsi-SQEMU_QEMU_HARDDISK_a9409f28-5c60-4825-964c-57b4bb65617a'], 'labels': [], 'masters': [], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {'sda1': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_a9409f28-5c60-4825-964c-57b4bb65617a-part1', 'scsi-SQEMU_QEMU_HARDDISK_a9409f28-5c60-4825-964c-57b4bb65617a-part1'], 'labels': ['cloudimg-rootfs'], 'masters': [], 'uuids': ['b852d8d2-8460-44aa-8998-23e4f04d73cf']}, 'sectors': 165672927, 'sectorsize': 512, 'size': '79.00 GB', 'start': '2099200', 'uuid': 'b852d8d2-8460-44aa-8998-23e4f04d73cf'}, 'sda14': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_a9409f28-5c60-4825-964c-57b4bb65617a-part14', 'scsi-SQEMU_QEMU_HARDDISK_a9409f28-5c60-4825-964c-57b4bb65617a-part14'], 'labels': [], 'masters': [], 'uuids': []}, 'sectors': 8192, 'sectorsize': 512, 'size': '4.00 MB', 'start': '2048', 'uuid': None}, 'sda15': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_a9409f28-5c60-4825-964c-57b4bb65617a-part15', 'scsi-SQEMU_QEMU_HARDDISK_a9409f28-5c60-4825-964c-57b4bb65617a-part15'], 'labels': ['UEFI'], 'masters': [], 'uuids': ['5C78-612A']}, 'sectors': 217088, 'sectorsize': 512, 'size': '106.00 MB', 'start': '10240', 'uuid': '5C78-612A'}, 'sda16': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_a9409f28-5c60-4825-964c-57b4bb65617a-part16', 'scsi-SQEMU_QEMU_HARDDISK_a9409f28-5c60-4825-964c-57b4bb65617a-part16'], 'labels': ['BOOT'], 'masters': [], 'uuids': ['09d53dc1-1e03-4286-bbb8-2b1796cf92ec']}, 'sectors': 1869825, 'sectorsize': 512, 'size': '913.00 MB', 'start': '227328', 'uuid': '09d53dc1-1e03-4286-bbb8-2b1796cf92ec'}}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 167772160, 'sectorsize': '512', 'size': '80.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}})  2025-09-27 00:53:56.620889 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'sr0', 'value': {'holders': [], 'host': 'IDE interface: Intel Corporation 82371SB PIIX3 IDE [Natoma/Triton II]', 'links': {'ids': ['ata-QEMU_DVD-ROM_QM00001'], 'labels': ['config-2'], 'masters': [], 'uuids': ['2025-09-27-00-02-08-00']}, 'model': 'QEMU DVD-ROM', 'partitions': {}, 'removable': '1', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'mq-deadline', 'sectors': 253, 'sectorsize': '2048', 'size': '506.00 KB', 'support_discard': '0', 'vendor': 'QEMU', 'virtual': 1}})  2025-09-27 00:53:56.620899 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:53:56.620909 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'loop0', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2025-09-27 00:53:56.620919 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'loop1', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  
2025-09-27 00:53:56.620929 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'loop2', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2025-09-27 00:53:56.620939 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'loop3', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2025-09-27 00:53:56.620954 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'loop4', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2025-09-27 00:53:56.620964 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'loop5', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2025-09-27 00:53:56.620974 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'loop6', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2025-09-27 00:53:56.620990 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'loop7', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2025-09-27 00:53:56.621001 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'sda', 'value': {'holders': [], 'host': 'SCSI storage controller: Red Hat, Inc. 
Virtio SCSI', 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_6dbfecf9-f5ff-4b34-b71a-2ff17908cbf0', 'scsi-SQEMU_QEMU_HARDDISK_6dbfecf9-f5ff-4b34-b71a-2ff17908cbf0'], 'labels': [], 'masters': [], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {'sda1': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_6dbfecf9-f5ff-4b34-b71a-2ff17908cbf0-part1', 'scsi-SQEMU_QEMU_HARDDISK_6dbfecf9-f5ff-4b34-b71a-2ff17908cbf0-part1'], 'labels': ['cloudimg-rootfs'], 'masters': [], 'uuids': ['b852d8d2-8460-44aa-8998-23e4f04d73cf']}, 'sectors': 165672927, 'sectorsize': 512, 'size': '79.00 GB', 'start': '2099200', 'uuid': 'b852d8d2-8460-44aa-8998-23e4f04d73cf'}, 'sda14': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_6dbfecf9-f5ff-4b34-b71a-2ff17908cbf0-part14', 'scsi-SQEMU_QEMU_HARDDISK_6dbfecf9-f5ff-4b34-b71a-2ff17908cbf0-part14'], 'labels': [], 'masters': [], 'uuids': []}, 'sectors': 8192, 'sectorsize': 512, 'size': '4.00 MB', 'start': '2048', 'uuid': None}, 'sda15': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_6dbfecf9-f5ff-4b34-b71a-2ff17908cbf0-part15', 'scsi-SQEMU_QEMU_HARDDISK_6dbfecf9-f5ff-4b34-b71a-2ff17908cbf0-part15'], 'labels': ['UEFI'], 'masters': [], 'uuids': ['5C78-612A']}, 'sectors': 217088, 'sectorsize': 512, 'size': '106.00 MB', 'start': '10240', 'uuid': '5C78-612A'}, 'sda16': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_6dbfecf9-f5ff-4b34-b71a-2ff17908cbf0-part16', 'scsi-SQEMU_QEMU_HARDDISK_6dbfecf9-f5ff-4b34-b71a-2ff17908cbf0-part16'], 'labels': ['BOOT'], 'masters': [], 'uuids': ['09d53dc1-1e03-4286-bbb8-2b1796cf92ec']}, 'sectors': 1869825, 'sectorsize': 512, 'size': '913.00 MB', 'start': '227328', 'uuid': '09d53dc1-1e03-4286-bbb8-2b1796cf92ec'}}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 167772160, 'sectorsize': '512', 'size': '80.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}})  2025-09-27 00:53:56.621017 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'sr0', 'value': {'holders': [], 'host': 'IDE interface: Intel Corporation 82371SB PIIX3 IDE [Natoma/Triton II]', 'links': {'ids': ['ata-QEMU_DVD-ROM_QM00001'], 'labels': ['config-2'], 'masters': [], 'uuids': ['2025-09-27-00-02-13-00']}, 'model': 'QEMU DVD-ROM', 'partitions': {}, 'removable': '1', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'mq-deadline', 'sectors': 253, 'sectorsize': '2048', 'size': '506.00 KB', 'support_discard': '0', 'vendor': 'QEMU', 'virtual': 1}})  2025-09-27 00:53:56.621028 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'dm-0', 'value': {'holders': [], 'host': '', 'links': {'ids': ['dm-name-ceph--025d8a54--72cd--5dfc--843f--2890244ba468-osd--block--025d8a54--72cd--5dfc--843f--2890244ba468', 'dm-uuid-LVM-XHCwvuU1sjTmaK85YDSmR6G7sbpVpAP3JMUNdJjHiYoiRZ0xWzEN1AgLJsu20b10'], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': '', 'sectors': 41934848, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': None, 'virtual': 1}})  2025-09-27 00:53:56.621040 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'dm-1', 'value': {'holders': [], 'host': '', 'links': {'ids': ['dm-name-ceph--9ca7935d--e986--5962--b530--505e6c7ac609-osd--block--9ca7935d--e986--5962--b530--505e6c7ac609', 
'dm-uuid-LVM-bDhFdDwT54ouiDFbfCRj6kE8iH1XDG316Ib1iSwqIA7E8LyFNu82J4CqDZUvs2si'], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': '', 'sectors': 41934848, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': None, 'virtual': 1}})  2025-09-27 00:53:56.621147 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'loop0', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2025-09-27 00:53:56.621172 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'loop1', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2025-09-27 00:53:56.621182 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'loop2', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2025-09-27 00:53:56.621192 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'loop3', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2025-09-27 00:53:56.621202 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'loop4', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2025-09-27 00:53:56.621212 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:53:56.621222 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'loop5', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2025-09-27 00:53:56.621240 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'loop6', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 
'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2025-09-27 00:53:56.621251 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'loop7', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2025-09-27 00:53:56.621272 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'sda', 'value': {'holders': [], 'host': 'SCSI storage controller: Red Hat, Inc. Virtio SCSI', 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_9830fc06-72ca-4b97-ae11-006364930d3a', 'scsi-SQEMU_QEMU_HARDDISK_9830fc06-72ca-4b97-ae11-006364930d3a'], 'labels': [], 'masters': [], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {'sda1': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_9830fc06-72ca-4b97-ae11-006364930d3a-part1', 'scsi-SQEMU_QEMU_HARDDISK_9830fc06-72ca-4b97-ae11-006364930d3a-part1'], 'labels': ['cloudimg-rootfs'], 'masters': [], 'uuids': ['b852d8d2-8460-44aa-8998-23e4f04d73cf']}, 'sectors': 165672927, 'sectorsize': 512, 'size': '79.00 GB', 'start': '2099200', 'uuid': 'b852d8d2-8460-44aa-8998-23e4f04d73cf'}, 'sda14': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_9830fc06-72ca-4b97-ae11-006364930d3a-part14', 'scsi-SQEMU_QEMU_HARDDISK_9830fc06-72ca-4b97-ae11-006364930d3a-part14'], 'labels': [], 'masters': [], 'uuids': []}, 'sectors': 8192, 'sectorsize': 512, 'size': '4.00 MB', 'start': '2048', 'uuid': None}, 'sda15': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_9830fc06-72ca-4b97-ae11-006364930d3a-part15', 'scsi-SQEMU_QEMU_HARDDISK_9830fc06-72ca-4b97-ae11-006364930d3a-part15'], 'labels': ['UEFI'], 'masters': [], 'uuids': ['5C78-612A']}, 'sectors': 217088, 'sectorsize': 512, 'size': '106.00 MB', 'start': '10240', 'uuid': '5C78-612A'}, 'sda16': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_9830fc06-72ca-4b97-ae11-006364930d3a-part16', 'scsi-SQEMU_QEMU_HARDDISK_9830fc06-72ca-4b97-ae11-006364930d3a-part16'], 'labels': ['BOOT'], 'masters': [], 'uuids': ['09d53dc1-1e03-4286-bbb8-2b1796cf92ec']}, 'sectors': 1869825, 'sectorsize': 512, 'size': '913.00 MB', 'start': '227328', 'uuid': '09d53dc1-1e03-4286-bbb8-2b1796cf92ec'}}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 167772160, 'sectorsize': '512', 'size': '80.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}})  2025-09-27 00:53:56.621285 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'sdb', 'value': {'holders': ['ceph--025d8a54--72cd--5dfc--843f--2890244ba468-osd--block--025d8a54--72cd--5dfc--843f--2890244ba468'], 'host': 'SCSI storage controller: Red Hat, Inc. 
Virtio SCSI', 'links': {'ids': ['lvm-pv-uuid-9rawNn-y563-rGNc-kwv8-GzbT-nvxJ-Bf2wvf', 'scsi-0QEMU_QEMU_HARDDISK_db689dff-d74e-43e3-a305-79ec0de29e1e', 'scsi-SQEMU_QEMU_HARDDISK_db689dff-d74e-43e3-a305-79ec0de29e1e'], 'labels': [], 'masters': ['dm-0'], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 41943040, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}})  2025-09-27 00:53:56.621302 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'sdc', 'value': {'holders': ['ceph--9ca7935d--e986--5962--b530--505e6c7ac609-osd--block--9ca7935d--e986--5962--b530--505e6c7ac609'], 'host': 'SCSI storage controller: Red Hat, Inc. Virtio SCSI', 'links': {'ids': ['lvm-pv-uuid-SGXdOj-WXQT-SPT5-jcYu-xrdU-Qh2y-VcSwqS', 'scsi-0QEMU_QEMU_HARDDISK_aa54db64-5ca4-4f56-bafa-5b00a4002696', 'scsi-SQEMU_QEMU_HARDDISK_aa54db64-5ca4-4f56-bafa-5b00a4002696'], 'labels': [], 'masters': ['dm-1'], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 41943040, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}})  2025-09-27 00:53:56.621314 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'sdd', 'value': {'holders': [], 'host': 'SCSI storage controller: Red Hat, Inc. Virtio SCSI', 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_09efdf41-dbe9-4aba-b0d6-c49a377077cc', 'scsi-SQEMU_QEMU_HARDDISK_09efdf41-dbe9-4aba-b0d6-c49a377077cc'], 'labels': [], 'masters': [], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 41943040, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}})  2025-09-27 00:53:56.621331 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'sr0', 'value': {'holders': [], 'host': 'IDE interface: Intel Corporation 82371SB PIIX3 IDE [Natoma/Triton II]', 'links': {'ids': ['ata-QEMU_DVD-ROM_QM00001'], 'labels': ['config-2'], 'masters': [], 'uuids': ['2025-09-27-00-02-13-00']}, 'model': 'QEMU DVD-ROM', 'partitions': {}, 'removable': '1', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'mq-deadline', 'sectors': 253, 'sectorsize': '2048', 'size': '506.00 KB', 'support_discard': '0', 'vendor': 'QEMU', 'virtual': 1}})  2025-09-27 00:53:56.621345 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'dm-0', 'value': {'holders': [], 'host': '', 'links': {'ids': ['dm-name-ceph--e62f59a6--4044--5e93--b85c--9f8cca280e9f-osd--block--e62f59a6--4044--5e93--b85c--9f8cca280e9f', 'dm-uuid-LVM-2j0R2lsOV7uYm5mBjwcNbbSh1kQKC6aWFs37eraHciH3dG7KX2uUyrR9le9M1Sxc'], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': '', 'sectors': 41934848, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': None, 'virtual': 1}})  2025-09-27 00:53:56.621356 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'dm-1', 'value': {'holders': [], 'host': '', 'links': {'ids': ['dm-name-ceph--634a63d2--bd22--5328--9676--28392545ed43-osd--block--634a63d2--bd22--5328--9676--28392545ed43', 
'dm-uuid-LVM-UlAlFHjSEGexCmx3gfRFT7UDwjIGr9mS6RTZCSiLBafcbpblurexRxLJlqNlUnkx'], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': '', 'sectors': 41934848, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': None, 'virtual': 1}})  2025-09-27 00:53:56.621366 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'loop0', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2025-09-27 00:53:56.621376 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'loop1', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2025-09-27 00:53:56.621386 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'loop2', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2025-09-27 00:53:56.621401 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'loop3', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2025-09-27 00:53:56.621417 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:53:56.621427 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'loop4', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2025-09-27 00:53:56.621437 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'loop5', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2025-09-27 00:53:56.621447 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'loop6', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 
'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2025-09-27 00:53:56.621461 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'loop7', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2025-09-27 00:53:56.621477 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'sda', 'value': {'holders': [], 'host': 'SCSI storage controller: Red Hat, Inc. Virtio SCSI', 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_ccfdba47-3be1-47ca-9d9b-c14dc21688e2', 'scsi-SQEMU_QEMU_HARDDISK_ccfdba47-3be1-47ca-9d9b-c14dc21688e2'], 'labels': [], 'masters': [], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {'sda1': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_ccfdba47-3be1-47ca-9d9b-c14dc21688e2-part1', 'scsi-SQEMU_QEMU_HARDDISK_ccfdba47-3be1-47ca-9d9b-c14dc21688e2-part1'], 'labels': ['cloudimg-rootfs'], 'masters': [], 'uuids': ['b852d8d2-8460-44aa-8998-23e4f04d73cf']}, 'sectors': 165672927, 'sectorsize': 512, 'size': '79.00 GB', 'start': '2099200', 'uuid': 'b852d8d2-8460-44aa-8998-23e4f04d73cf'}, 'sda14': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_ccfdba47-3be1-47ca-9d9b-c14dc21688e2-part14', 'scsi-SQEMU_QEMU_HARDDISK_ccfdba47-3be1-47ca-9d9b-c14dc21688e2-part14'], 'labels': [], 'masters': [], 'uuids': []}, 'sectors': 8192, 'sectorsize': 512, 'size': '4.00 MB', 'start': '2048', 'uuid': None}, 'sda15': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_ccfdba47-3be1-47ca-9d9b-c14dc21688e2-part15', 'scsi-SQEMU_QEMU_HARDDISK_ccfdba47-3be1-47ca-9d9b-c14dc21688e2-part15'], 'labels': ['UEFI'], 'masters': [], 'uuids': ['5C78-612A']}, 'sectors': 217088, 'sectorsize': 512, 'size': '106.00 MB', 'start': '10240', 'uuid': '5C78-612A'}, 'sda16': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_ccfdba47-3be1-47ca-9d9b-c14dc21688e2-part16', 'scsi-SQEMU_QEMU_HARDDISK_ccfdba47-3be1-47ca-9d9b-c14dc21688e2-part16'], 'labels': ['BOOT'], 'masters': [], 'uuids': ['09d53dc1-1e03-4286-bbb8-2b1796cf92ec']}, 'sectors': 1869825, 'sectorsize': 512, 'size': '913.00 MB', 'start': '227328', 'uuid': '09d53dc1-1e03-4286-bbb8-2b1796cf92ec'}}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 167772160, 'sectorsize': '512', 'size': '80.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}})  2025-09-27 00:53:56.621489 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'sdb', 'value': {'holders': ['ceph--e62f59a6--4044--5e93--b85c--9f8cca280e9f-osd--block--e62f59a6--4044--5e93--b85c--9f8cca280e9f'], 'host': 'SCSI storage controller: Red Hat, Inc. 
Virtio SCSI', 'links': {'ids': ['lvm-pv-uuid-Me9aue-lek3-JPg0-VYec-326H-ZuKM-XDWaPz', 'scsi-0QEMU_QEMU_HARDDISK_e258aa1c-ff59-4b5b-956f-d2cfc00f460b', 'scsi-SQEMU_QEMU_HARDDISK_e258aa1c-ff59-4b5b-956f-d2cfc00f460b'], 'labels': [], 'masters': ['dm-0'], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 41943040, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}})  2025-09-27 00:53:56.621506 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'sdc', 'value': {'holders': ['ceph--634a63d2--bd22--5328--9676--28392545ed43-osd--block--634a63d2--bd22--5328--9676--28392545ed43'], 'host': 'SCSI storage controller: Red Hat, Inc. Virtio SCSI', 'links': {'ids': ['lvm-pv-uuid-MLfaDW-hcSX-UUuz-T6hf-jnwD-2Ymd-7lmoLK', 'scsi-0QEMU_QEMU_HARDDISK_f6166654-1631-4845-81e5-73fa20742766', 'scsi-SQEMU_QEMU_HARDDISK_f6166654-1631-4845-81e5-73fa20742766'], 'labels': [], 'masters': ['dm-1'], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 41943040, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}})  2025-09-27 00:53:56.621521 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'sdd', 'value': {'holders': [], 'host': 'SCSI storage controller: Red Hat, Inc. Virtio SCSI', 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_88b94aa1-4c02-44af-bedb-78cbed569408', 'scsi-SQEMU_QEMU_HARDDISK_88b94aa1-4c02-44af-bedb-78cbed569408'], 'labels': [], 'masters': [], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 41943040, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}})  2025-09-27 00:53:56.621531 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'sr0', 'value': {'holders': [], 'host': 'IDE interface: Intel Corporation 82371SB PIIX3 IDE [Natoma/Triton II]', 'links': {'ids': ['ata-QEMU_DVD-ROM_QM00001'], 'labels': ['config-2'], 'masters': [], 'uuids': ['2025-09-27-00-02-11-00']}, 'model': 'QEMU DVD-ROM', 'partitions': {}, 'removable': '1', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'mq-deadline', 'sectors': 253, 'sectorsize': '2048', 'size': '506.00 KB', 'support_discard': '0', 'vendor': 'QEMU', 'virtual': 1}})  2025-09-27 00:53:56.621541 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:53:56.621551 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'dm-0', 'value': {'holders': [], 'host': '', 'links': {'ids': ['dm-name-ceph--03e94b17--8e91--5aba--9ae0--0b9f0a63cf06-osd--block--03e94b17--8e91--5aba--9ae0--0b9f0a63cf06', 'dm-uuid-LVM-K7Add9racGU2L9Njoe4PiYwcIpDjr05MSre6J2Y3OxofcXM429pZ0szxlstbspnD'], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': '', 'sectors': 41934848, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': None, 'virtual': 1}})  2025-09-27 00:53:56.621562 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'dm-1', 'value': {'holders': [], 'host': '', 'links': {'ids': 
['dm-name-ceph--26537eb5--d37a--51fe--a7ad--0ae3582304de-osd--block--26537eb5--d37a--51fe--a7ad--0ae3582304de', 'dm-uuid-LVM-9u1Et2K86e5ar3MRSDs86Q4n2GlHJ9LdwOYDbas8h12ne7B2BpRv09Athluk4exq'], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': '', 'sectors': 41934848, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': None, 'virtual': 1}})  2025-09-27 00:53:56.621584 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'loop0', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2025-09-27 00:53:56.621594 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'loop1', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2025-09-27 00:53:56.621604 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'loop2', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2025-09-27 00:53:56.621614 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'loop3', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2025-09-27 00:53:56.621628 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'loop4', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2025-09-27 00:53:56.621638 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'loop5', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2025-09-27 00:53:56.621648 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'loop6', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 
'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2025-09-27 00:53:56.621658 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'loop7', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2025-09-27 00:53:56.621677 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'sda', 'value': {'holders': [], 'host': 'SCSI storage controller: Red Hat, Inc. Virtio SCSI', 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_140074d2-f452-403c-a39b-9b7f03d301d0', 'scsi-SQEMU_QEMU_HARDDISK_140074d2-f452-403c-a39b-9b7f03d301d0'], 'labels': [], 'masters': [], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {'sda1': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_140074d2-f452-403c-a39b-9b7f03d301d0-part1', 'scsi-SQEMU_QEMU_HARDDISK_140074d2-f452-403c-a39b-9b7f03d301d0-part1'], 'labels': ['cloudimg-rootfs'], 'masters': [], 'uuids': ['b852d8d2-8460-44aa-8998-23e4f04d73cf']}, 'sectors': 165672927, 'sectorsize': 512, 'size': '79.00 GB', 'start': '2099200', 'uuid': 'b852d8d2-8460-44aa-8998-23e4f04d73cf'}, 'sda14': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_140074d2-f452-403c-a39b-9b7f03d301d0-part14', 'scsi-SQEMU_QEMU_HARDDISK_140074d2-f452-403c-a39b-9b7f03d301d0-part14'], 'labels': [], 'masters': [], 'uuids': []}, 'sectors': 8192, 'sectorsize': 512, 'size': '4.00 MB', 'start': '2048', 'uuid': None}, 'sda15': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_140074d2-f452-403c-a39b-9b7f03d301d0-part15', 'scsi-SQEMU_QEMU_HARDDISK_140074d2-f452-403c-a39b-9b7f03d301d0-part15'], 'labels': ['UEFI'], 'masters': [], 'uuids': ['5C78-612A']}, 'sectors': 217088, 'sectorsize': 512, 'size': '106.00 MB', 'start': '10240', 'uuid': '5C78-612A'}, 'sda16': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_140074d2-f452-403c-a39b-9b7f03d301d0-part16', 'scsi-SQEMU_QEMU_HARDDISK_140074d2-f452-403c-a39b-9b7f03d301d0-part16'], 'labels': ['BOOT'], 'masters': [], 'uuids': ['09d53dc1-1e03-4286-bbb8-2b1796cf92ec']}, 'sectors': 1869825, 'sectorsize': 512, 'size': '913.00 MB', 'start': '227328', 'uuid': '09d53dc1-1e03-4286-bbb8-2b1796cf92ec'}}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 167772160, 'sectorsize': '512', 'size': '80.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}})  2025-09-27 00:53:56.621698 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'sdb', 'value': {'holders': ['ceph--03e94b17--8e91--5aba--9ae0--0b9f0a63cf06-osd--block--03e94b17--8e91--5aba--9ae0--0b9f0a63cf06'], 'host': 'SCSI storage controller: Red Hat, Inc. 
Virtio SCSI', 'links': {'ids': ['lvm-pv-uuid-3Sibgw-UrUF-sguV-cbL6-UeGk-LmaD-n7sO1o', 'scsi-0QEMU_QEMU_HARDDISK_44ee43e4-0ad4-479b-91ef-60ee60e7859d', 'scsi-SQEMU_QEMU_HARDDISK_44ee43e4-0ad4-479b-91ef-60ee60e7859d'], 'labels': [], 'masters': ['dm-0'], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 41943040, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}})  2025-09-27 00:53:56.621709 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'sdc', 'value': {'holders': ['ceph--26537eb5--d37a--51fe--a7ad--0ae3582304de-osd--block--26537eb5--d37a--51fe--a7ad--0ae3582304de'], 'host': 'SCSI storage controller: Red Hat, Inc. Virtio SCSI', 'links': {'ids': ['lvm-pv-uuid-RSp72m-6Gr3-OGR0-2DWW-9yZY-dCTz-l1m7o8', 'scsi-0QEMU_QEMU_HARDDISK_3491b7a4-1f4d-422d-b24b-7572a092bd2f', 'scsi-SQEMU_QEMU_HARDDISK_3491b7a4-1f4d-422d-b24b-7572a092bd2f'], 'labels': [], 'masters': ['dm-1'], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 41943040, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}})  2025-09-27 00:53:56.621719 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'sdd', 'value': {'holders': [], 'host': 'SCSI storage controller: Red Hat, Inc. Virtio SCSI', 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_06352aa6-6cdc-4b09-96e0-787a93e7d706', 'scsi-SQEMU_QEMU_HARDDISK_06352aa6-6cdc-4b09-96e0-787a93e7d706'], 'labels': [], 'masters': [], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 41943040, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}})  2025-09-27 00:53:56.621729 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'sr0', 'value': {'holders': [], 'host': 'IDE interface: Intel Corporation 82371SB PIIX3 IDE [Natoma/Triton II]', 'links': {'ids': ['ata-QEMU_DVD-ROM_QM00001'], 'labels': ['config-2'], 'masters': [], 'uuids': ['2025-09-27-00-02-15-00']}, 'model': 'QEMU DVD-ROM', 'partitions': {}, 'removable': '1', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'mq-deadline', 'sectors': 253, 'sectorsize': '2048', 'size': '506.00 KB', 'support_discard': '0', 'vendor': 'QEMU', 'virtual': 1}})  2025-09-27 00:53:56.621752 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:53:56.621763 | orchestrator | 2025-09-27 00:53:56.621773 | orchestrator | TASK [ceph-facts : Set_fact devices generate device list when osd_auto_discovery] *** 2025-09-27 00:53:56.621782 | orchestrator | Saturday 27 September 2025 00:44:04 +0000 (0:00:01.667) 0:00:31.708 **** 2025-09-27 00:53:56.621793 | orchestrator | skipping: [testbed-node-0] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'inventory_hostname in groups.get(osd_group_name, [])', 'item': {'key': 'loop0', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 
'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2025-09-27 00:53:56.621804 | orchestrator | skipping: [testbed-node-0] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'inventory_hostname in groups.get(osd_group_name, [])', 'item': {'key': 'loop1', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2025-09-27 00:53:56.621818 | orchestrator | skipping: [testbed-node-0] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'inventory_hostname in groups.get(osd_group_name, [])', 'item': {'key': 'loop2', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2025-09-27 00:53:56.621829 | orchestrator | skipping: [testbed-node-0] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'inventory_hostname in groups.get(osd_group_name, [])', 'item': {'key': 'loop3', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2025-09-27 00:53:56.621839 | orchestrator | skipping: [testbed-node-0] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'inventory_hostname in groups.get(osd_group_name, [])', 'item': {'key': 'loop4', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2025-09-27 00:53:56.621849 | orchestrator | skipping: [testbed-node-0] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'inventory_hostname in groups.get(osd_group_name, [])', 'item': {'key': 'loop5', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2025-09-27 00:53:56.621874 | orchestrator | skipping: [testbed-node-0] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'inventory_hostname in 
groups.get(osd_group_name, [])', 'item': {'key': 'loop6', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2025-09-27 00:53:56.621885 | orchestrator | skipping: [testbed-node-0] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'inventory_hostname in groups.get(osd_group_name, [])', 'item': {'key': 'loop7', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2025-09-27 00:53:56.621901 | orchestrator | skipping: [testbed-node-0] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'inventory_hostname in groups.get(osd_group_name, [])', 'item': {'key': 'sda', 'value': {'holders': [], 'host': 'SCSI storage controller: Red Hat, Inc. Virtio SCSI', 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_0d8bc224-c074-480c-a812-f41f761871f6', 'scsi-SQEMU_QEMU_HARDDISK_0d8bc224-c074-480c-a812-f41f761871f6'], 'labels': [], 'masters': [], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {'sda1': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_0d8bc224-c074-480c-a812-f41f761871f6-part1', 'scsi-SQEMU_QEMU_HARDDISK_0d8bc224-c074-480c-a812-f41f761871f6-part1'], 'labels': ['cloudimg-rootfs'], 'masters': [], 'uuids': ['b852d8d2-8460-44aa-8998-23e4f04d73cf']}, 'sectors': 165672927, 'sectorsize': 512, 'size': '79.00 GB', 'start': '2099200', 'uuid': 'b852d8d2-8460-44aa-8998-23e4f04d73cf'}, 'sda14': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_0d8bc224-c074-480c-a812-f41f761871f6-part14', 'scsi-SQEMU_QEMU_HARDDISK_0d8bc224-c074-480c-a812-f41f761871f6-part14'], 'labels': [], 'masters': [], 'uuids': []}, 'sectors': 8192, 'sectorsize': 512, 'size': '4.00 MB', 'start': '2048', 'uuid': None}, 'sda15': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_0d8bc224-c074-480c-a812-f41f761871f6-part15', 'scsi-SQEMU_QEMU_HARDDISK_0d8bc224-c074-480c-a812-f41f761871f6-part15'], 'labels': ['UEFI'], 'masters': [], 'uuids': ['5C78-612A']}, 'sectors': 217088, 'sectorsize': 512, 'size': '106.00 MB', 'start': '10240', 'uuid': '5C78-612A'}, 'sda16': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_0d8bc224-c074-480c-a812-f41f761871f6-part16', 'scsi-SQEMU_QEMU_HARDDISK_0d8bc224-c074-480c-a812-f41f761871f6-part16'], 'labels': ['BOOT'], 'masters': [], 'uuids': ['09d53dc1-1e03-4286-bbb8-2b1796cf92ec']}, 'sectors': 1869825, 'sectorsize': 512, 'size': '913.00 MB', 'start': '227328', 'uuid': '09d53dc1-1e03-4286-bbb8-2b1796cf92ec'}}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 167772160, 'sectorsize': '512', 'size': '80.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}}, 'ansible_loop_var': 'item'})  2025-09-27 00:53:56.621923 | orchestrator | skipping: [testbed-node-0] => (item={'changed': False, 'skipped': True, 
'skip_reason': 'Conditional result was False', 'false_condition': 'inventory_hostname in groups.get(osd_group_name, [])', 'item': {'key': 'sr0', 'value': {'holders': [], 'host': 'IDE interface: Intel Corporation 82371SB PIIX3 IDE [Natoma/Triton II]', 'links': {'ids': ['ata-QEMU_DVD-ROM_QM00001'], 'labels': ['config-2'], 'masters': [], 'uuids': ['2025-09-27-00-02-10-00']}, 'model': 'QEMU DVD-ROM', 'partitions': {}, 'removable': '1', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'mq-deadline', 'sectors': 253, 'sectorsize': '2048', 'size': '506.00 KB', 'support_discard': '0', 'vendor': 'QEMU', 'virtual': 1}}, 'ansible_loop_var': 'item'})  2025-09-27 00:53:56.621934 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:53:56.621944 | orchestrator | skipping: [testbed-node-1] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'inventory_hostname in groups.get(osd_group_name, [])', 'item': {'key': 'loop0', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2025-09-27 00:53:56.621955 | orchestrator | skipping: [testbed-node-1] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'inventory_hostname in groups.get(osd_group_name, [])', 'item': {'key': 'loop1', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2025-09-27 00:53:56.621965 | orchestrator | skipping: [testbed-node-1] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'inventory_hostname in groups.get(osd_group_name, [])', 'item': {'key': 'loop2', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2025-09-27 00:53:56.621979 | orchestrator | skipping: [testbed-node-1] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'inventory_hostname in groups.get(osd_group_name, [])', 'item': {'key': 'loop3', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2025-09-27 00:53:56.621989 | orchestrator | skipping: [testbed-node-1] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 
'inventory_hostname in groups.get(osd_group_name, [])', 'item': {'key': 'loop4', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2025-09-27 00:53:56.622006 | orchestrator | skipping: [testbed-node-1] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'inventory_hostname in groups.get(osd_group_name, [])', 'item': {'key': 'loop5', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2025-09-27 00:53:56.622078 | orchestrator | skipping: [testbed-node-1] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'inventory_hostname in groups.get(osd_group_name, [])', 'item': {'key': 'loop6', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2025-09-27 00:53:56.622144 | orchestrator | skipping: [testbed-node-1] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'inventory_hostname in groups.get(osd_group_name, [])', 'item': {'key': 'loop7', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2025-09-27 00:53:56.622162 | orchestrator | skipping: [testbed-node-1] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'inventory_hostname in groups.get(osd_group_name, [])', 'item': {'key': 'sda', 'value': {'holders': [], 'host': 'SCSI storage controller: Red Hat, Inc. 
Virtio SCSI', 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_a9409f28-5c60-4825-964c-57b4bb65617a', 'scsi-SQEMU_QEMU_HARDDISK_a9409f28-5c60-4825-964c-57b4bb65617a'], 'labels': [], 'masters': [], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {'sda1': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_a9409f28-5c60-4825-964c-57b4bb65617a-part1', 'scsi-SQEMU_QEMU_HARDDISK_a9409f28-5c60-4825-964c-57b4bb65617a-part1'], 'labels': ['cloudimg-rootfs'], 'masters': [], 'uuids': ['b852d8d2-8460-44aa-8998-23e4f04d73cf']}, 'sectors': 165672927, 'sectorsize': 512, 'size': '79.00 GB', 'start': '2099200', 'uuid': 'b852d8d2-8460-44aa-8998-23e4f04d73cf'}, 'sda14': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_a9409f28-5c60-4825-964c-57b4bb65617a-part14', 'scsi-SQEMU_QEMU_HARDDISK_a9409f28-5c60-4825-964c-57b4bb65617a-part14'], 'labels': [], 'masters': [], 'uuids': []}, 'sectors': 8192, 'sectorsize': 512, 'size': '4.00 MB', 'start': '2048', 'uuid': None}, 'sda15': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_a9409f28-5c60-4825-964c-57b4bb65617a-part15', 'scsi-SQEMU_QEMU_HARDDISK_a9409f28-5c60-4825-964c-57b4bb65617a-part15'], 'labels': ['UEFI'], 'masters': [], 'uuids': ['5C78-612A']}, 'sectors': 217088, 'sectorsize': 512, 'size': '106.00 MB', 'start': '10240', 'uuid': '5C78-612A'}, 'sda16': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_a9409f28-5c60-4825-964c-57b4bb65617a-part16', 'scsi-SQEMU_QEMU_HARDDISK_a9409f28-5c60-4825-964c-57b4bb65617a-part16'], 'labels': ['BOOT'], 'masters': [], 'uuids': ['09d53dc1-1e03-4286-bbb8-2b1796cf92ec']}, 'sectors': 1869825, 'sectorsize': 512, 'size': '913.00 MB', 'start': '227328', 'uuid': '09d53dc1-1e03-4286-bbb8-2b1796cf92ec'}}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 167772160, 'sectorsize': '512', 'size': '80.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}}, 'ansible_loop_var': 'item'})  2025-09-27 00:53:56.622181 | orchestrator | skipping: [testbed-node-1] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'inventory_hostname in groups.get(osd_group_name, [])', 'item': {'key': 'sr0', 'value': {'holders': [], 'host': 'IDE interface: Intel Corporation 82371SB PIIX3 IDE [Natoma/Triton II]', 'links': {'ids': ['ata-QEMU_DVD-ROM_QM00001'], 'labels': ['config-2'], 'masters': [], 'uuids': ['2025-09-27-00-02-08-00']}, 'model': 'QEMU DVD-ROM', 'partitions': {}, 'removable': '1', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'mq-deadline', 'sectors': 253, 'sectorsize': '2048', 'size': '506.00 KB', 'support_discard': '0', 'vendor': 'QEMU', 'virtual': 1}}, 'ansible_loop_var': 'item'})  2025-09-27 00:53:56.622198 | orchestrator | skipping: [testbed-node-2] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'inventory_hostname in groups.get(osd_group_name, [])', 'item': {'key': 'loop0', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2025-09-27 00:53:56.622208 | orchestrator | skipping: [testbed-node-2] => (item={'changed': False, 'skipped': 
True, 'skip_reason': 'Conditional result was False', 'false_condition': 'inventory_hostname in groups.get(osd_group_name, [])', 'item': {'key': 'loop1', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2025-09-27 00:53:56.622218 | orchestrator | skipping: [testbed-node-2] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'inventory_hostname in groups.get(osd_group_name, [])', 'item': {'key': 'loop2', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2025-09-27 00:53:56.622233 | orchestrator | skipping: [testbed-node-2] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'inventory_hostname in groups.get(osd_group_name, [])', 'item': {'key': 'loop3', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2025-09-27 00:53:56.622244 | orchestrator | skipping: [testbed-node-2] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'inventory_hostname in groups.get(osd_group_name, [])', 'item': {'key': 'loop4', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2025-09-27 00:53:56.622260 | orchestrator | skipping: [testbed-node-2] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'inventory_hostname in groups.get(osd_group_name, [])', 'item': {'key': 'loop5', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2025-09-27 00:53:56.622275 | orchestrator | skipping: [testbed-node-2] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'inventory_hostname in groups.get(osd_group_name, [])', 'item': {'key': 'loop6', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': 
'0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2025-09-27 00:53:56.622286 | orchestrator | skipping: [testbed-node-2] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'inventory_hostname in groups.get(osd_group_name, [])', 'item': {'key': 'loop7', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2025-09-27 00:53:56.622302 | orchestrator | skipping: [testbed-node-2] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'inventory_hostname in groups.get(osd_group_name, [])', 'item': {'key': 'sda', 'value': {'holders': [], 'host': 'SCSI storage controller: Red Hat, Inc. Virtio SCSI', 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_6dbfecf9-f5ff-4b34-b71a-2ff17908cbf0', 'scsi-SQEMU_QEMU_HARDDISK_6dbfecf9-f5ff-4b34-b71a-2ff17908cbf0'], 'labels': [], 'masters': [], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {'sda1': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_6dbfecf9-f5ff-4b34-b71a-2ff17908cbf0-part1', 'scsi-SQEMU_QEMU_HARDDISK_6dbfecf9-f5ff-4b34-b71a-2ff17908cbf0-part1'], 'labels': ['cloudimg-rootfs'], 'masters': [], 'uuids': ['b852d8d2-8460-44aa-8998-23e4f04d73cf']}, 'sectors': 165672927, 'sectorsize': 512, 'size': '79.00 GB', 'start': '2099200', 'uuid': 'b852d8d2-8460-44aa-8998-23e4f04d73cf'}, 'sda14': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_6dbfecf9-f5ff-4b34-b71a-2ff17908cbf0-part14', 'scsi-SQEMU_QEMU_HARDDISK_6dbfecf9-f5ff-4b34-b71a-2ff17908cbf0-part14'], 'labels': [], 'masters': [], 'uuids': []}, 'sectors': 8192, 'sectorsize': 512, 'size': '4.00 MB', 'start': '2048', 'uuid': None}, 'sda15': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_6dbfecf9-f5ff-4b34-b71a-2ff17908cbf0-part15', 'scsi-SQEMU_QEMU_HARDDISK_6dbfecf9-f5ff-4b34-b71a-2ff17908cbf0-part15'], 'labels': ['UEFI'], 'masters': [], 'uuids': ['5C78-612A']}, 'sectors': 217088, 'sectorsize': 512, 'size': '106.00 MB', 'start': '10240', 'uuid': '5C78-612A'}, 'sda16': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_6dbfecf9-f5ff-4b34-b71a-2ff17908cbf0-part16', 'scsi-SQEMU_QEMU_HARDDISK_6dbfecf9-f5ff-4b34-b71a-2ff17908cbf0-part16'], 'labels': ['BOOT'], 'masters': [], 'uuids': ['09d53dc1-1e03-4286-bbb8-2b1796cf92ec']}, 'sectors': 1869825, 'sectorsize': 512, 'size': '913.00 MB', 'start': '227328', 'uuid': '09d53dc1-1e03-4286-bbb8-2b1796cf92ec'}}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 167772160, 'sectorsize': '512', 'size': '80.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}}, 'ansible_loop_var': 'item'})  2025-09-27 00:53:56.622319 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:53:56.622329 | orchestrator | skipping: [testbed-node-2] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'inventory_hostname in groups.get(osd_group_name, [])', 'item': {'key': 'sr0', 
'value': {'holders': [], 'host': 'IDE interface: Intel Corporation 82371SB PIIX3 IDE [Natoma/Triton II]', 'links': {'ids': ['ata-QEMU_DVD-ROM_QM00001'], 'labels': ['config-2'], 'masters': [], 'uuids': ['2025-09-27-00-02-13-00']}, 'model': 'QEMU DVD-ROM', 'partitions': {}, 'removable': '1', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'mq-deadline', 'sectors': 253, 'sectorsize': '2048', 'size': '506.00 KB', 'support_discard': '0', 'vendor': 'QEMU', 'virtual': 1}}, 'ansible_loop_var': 'item'})  2025-09-27 00:53:56.622346 | orchestrator | skipping: [testbed-node-3] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'dm-0', 'value': {'holders': [], 'host': '', 'links': {'ids': ['dm-name-ceph--025d8a54--72cd--5dfc--843f--2890244ba468-osd--block--025d8a54--72cd--5dfc--843f--2890244ba468', 'dm-uuid-LVM-XHCwvuU1sjTmaK85YDSmR6G7sbpVpAP3JMUNdJjHiYoiRZ0xWzEN1AgLJsu20b10'], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': '', 'sectors': 41934848, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2025-09-27 00:53:56.622357 | orchestrator | skipping: [testbed-node-3] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'dm-1', 'value': {'holders': [], 'host': '', 'links': {'ids': ['dm-name-ceph--9ca7935d--e986--5962--b530--505e6c7ac609-osd--block--9ca7935d--e986--5962--b530--505e6c7ac609', 'dm-uuid-LVM-bDhFdDwT54ouiDFbfCRj6kE8iH1XDG316Ib1iSwqIA7E8LyFNu82J4CqDZUvs2si'], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': '', 'sectors': 41934848, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2025-09-27 00:53:56.622367 | orchestrator | skipping: [testbed-node-3] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'loop0', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2025-09-27 00:53:56.622382 | orchestrator | skipping: [testbed-node-3] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'loop1', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2025-09-27 00:53:56.622400 | orchestrator | skipping: 
[testbed-node-2] 2025-09-27 00:53:56.622410 | orchestrator | skipping: [testbed-node-3] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'loop2', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2025-09-27 00:53:56.622420 | orchestrator | skipping: [testbed-node-3] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'loop3', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2025-09-27 00:53:56.622436 | orchestrator | skipping: [testbed-node-3] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'loop4', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2025-09-27 00:53:56.622447 | orchestrator | skipping: [testbed-node-3] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'loop5', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2025-09-27 00:53:56.622457 | orchestrator | skipping: [testbed-node-3] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'loop6', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2025-09-27 00:53:56.622475 | orchestrator | skipping: [testbed-node-4] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'dm-0', 'value': {'holders': [], 'host': '', 'links': {'ids': 
['dm-name-ceph--e62f59a6--4044--5e93--b85c--9f8cca280e9f-osd--block--e62f59a6--4044--5e93--b85c--9f8cca280e9f', 'dm-uuid-LVM-2j0R2lsOV7uYm5mBjwcNbbSh1kQKC6aWFs37eraHciH3dG7KX2uUyrR9le9M1Sxc'], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': '', 'sectors': 41934848, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2025-09-27 00:53:56.622493 | orchestrator | skipping: [testbed-node-3] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'loop7', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2025-09-27 00:53:56.622503 | orchestrator | skipping: [testbed-node-4] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'dm-1', 'value': {'holders': [], 'host': '', 'links': {'ids': ['dm-name-ceph--634a63d2--bd22--5328--9676--28392545ed43-osd--block--634a63d2--bd22--5328--9676--28392545ed43', 'dm-uuid-LVM-UlAlFHjSEGexCmx3gfRFT7UDwjIGr9mS6RTZCSiLBafcbpblurexRxLJlqNlUnkx'], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': '', 'sectors': 41934848, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2025-09-27 00:53:56.622530 | orchestrator | skipping: [testbed-node-4] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'loop0', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2025-09-27 00:53:56.622547 | orchestrator | skipping: [testbed-node-3] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'sda', 'value': {'holders': [], 'host': 'SCSI storage controller: Red Hat, Inc. 
Virtio SCSI', 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_9830fc06-72ca-4b97-ae11-006364930d3a', 'scsi-SQEMU_QEMU_HARDDISK_9830fc06-72ca-4b97-ae11-006364930d3a'], 'labels': [], 'masters': [], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {'sda1': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_9830fc06-72ca-4b97-ae11-006364930d3a-part1', 'scsi-SQEMU_QEMU_HARDDISK_9830fc06-72ca-4b97-ae11-006364930d3a-part1'], 'labels': ['cloudimg-rootfs'], 'masters': [], 'uuids': ['b852d8d2-8460-44aa-8998-23e4f04d73cf']}, 'sectors': 165672927, 'sectorsize': 512, 'size': '79.00 GB', 'start': '2099200', 'uuid': 'b852d8d2-8460-44aa-8998-23e4f04d73cf'}, 'sda14': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_9830fc06-72ca-4b97-ae11-006364930d3a-part14', 'scsi-SQEMU_QEMU_HARDDISK_9830fc06-72ca-4b97-ae11-006364930d3a-part14'], 'labels': [], 'masters': [], 'uuids': []}, 'sectors': 8192, 'sectorsize': 512, 'size': '4.00 MB', 'start': '2048', 'uuid': None}, 'sda15': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_9830fc06-72ca-4b97-ae11-006364930d3a-part15', 'scsi-SQEMU_QEMU_HARDDISK_9830fc06-72ca-4b97-ae11-006364930d3a-part15'], 'labels': ['UEFI'], 'masters': [], 'uuids': ['5C78-612A']}, 'sectors': 217088, 'sectorsize': 512, 'size': '106.00 MB', 'start': '10240', 'uuid': '5C78-612A'}, 'sda16': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_9830fc06-72ca-4b97-ae11-006364930d3a-part16', 'scsi-SQEMU_QEMU_HARDDISK_9830fc06-72ca-4b97-ae11-006364930d3a-part16'], 'labels': ['BOOT'], 'masters': [], 'uuids': ['09d53dc1-1e03-4286-bbb8-2b1796cf92ec']}, 'sectors': 1869825, 'sectorsize': 512, 'size': '913.00 MB', 'start': '227328', 'uuid': '09d53dc1-1e03-4286-bbb8-2b1796cf92ec'}}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 167772160, 'sectorsize': '512', 'size': '80.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}}, 'ansible_loop_var': 'item'})  2025-09-27 00:53:56.622565 | orchestrator | skipping: [testbed-node-4] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'loop1', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2025-09-27 00:53:56.622576 | orchestrator | skipping: [testbed-node-3] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'sdb', 'value': {'holders': ['ceph--025d8a54--72cd--5dfc--843f--2890244ba468-osd--block--025d8a54--72cd--5dfc--843f--2890244ba468'], 'host': 'SCSI storage controller: Red Hat, Inc. 
Virtio SCSI', 'links': {'ids': ['lvm-pv-uuid-9rawNn-y563-rGNc-kwv8-GzbT-nvxJ-Bf2wvf', 'scsi-0QEMU_QEMU_HARDDISK_db689dff-d74e-43e3-a305-79ec0de29e1e', 'scsi-SQEMU_QEMU_HARDDISK_db689dff-d74e-43e3-a305-79ec0de29e1e'], 'labels': [], 'masters': ['dm-0'], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 41943040, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}}, 'ansible_loop_var': 'item'})  2025-09-27 00:53:56.622592 | orchestrator | skipping: [testbed-node-4] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'loop2', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2025-09-27 00:53:56.622603 | orchestrator | skipping: [testbed-node-3] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'sdc', 'value': {'holders': ['ceph--9ca7935d--e986--5962--b530--505e6c7ac609-osd--block--9ca7935d--e986--5962--b530--505e6c7ac609'], 'host': 'SCSI storage controller: Red Hat, Inc. Virtio SCSI', 'links': {'ids': ['lvm-pv-uuid-SGXdOj-WXQT-SPT5-jcYu-xrdU-Qh2y-VcSwqS', 'scsi-0QEMU_QEMU_HARDDISK_aa54db64-5ca4-4f56-bafa-5b00a4002696', 'scsi-SQEMU_QEMU_HARDDISK_aa54db64-5ca4-4f56-bafa-5b00a4002696'], 'labels': [], 'masters': ['dm-1'], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 41943040, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}}, 'ansible_loop_var': 'item'})  2025-09-27 00:53:56.622618 | orchestrator | skipping: [testbed-node-3] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'sdd', 'value': {'holders': [], 'host': 'SCSI storage controller: Red Hat, Inc. 
Virtio SCSI', 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_09efdf41-dbe9-4aba-b0d6-c49a377077cc', 'scsi-SQEMU_QEMU_HARDDISK_09efdf41-dbe9-4aba-b0d6-c49a377077cc'], 'labels': [], 'masters': [], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 41943040, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}}, 'ansible_loop_var': 'item'})  2025-09-27 00:53:56.622631 | orchestrator | skipping: [testbed-node-4] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'loop3', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2025-09-27 00:53:56.622639 | orchestrator | skipping: [testbed-node-3] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'sr0', 'value': {'holders': [], 'host': 'IDE interface: Intel Corporation 82371SB PIIX3 IDE [Natoma/Triton II]', 'links': {'ids': ['ata-QEMU_DVD-ROM_QM00001'], 'labels': ['config-2'], 'masters': [], 'uuids': ['2025-09-27-00-02-13-00']}, 'model': 'QEMU DVD-ROM', 'partitions': {}, 'removable': '1', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'mq-deadline', 'sectors': 253, 'sectorsize': '2048', 'size': '506.00 KB', 'support_discard': '0', 'vendor': 'QEMU', 'virtual': 1}}, 'ansible_loop_var': 'item'})  2025-09-27 00:53:56.622648 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:53:56.622668 | orchestrator | skipping: [testbed-node-5] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'dm-0', 'value': {'holders': [], 'host': '', 'links': {'ids': ['dm-name-ceph--03e94b17--8e91--5aba--9ae0--0b9f0a63cf06-osd--block--03e94b17--8e91--5aba--9ae0--0b9f0a63cf06', 'dm-uuid-LVM-K7Add9racGU2L9Njoe4PiYwcIpDjr05MSre6J2Y3OxofcXM429pZ0szxlstbspnD'], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': '', 'sectors': 41934848, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2025-09-27 00:53:56.622677 | orchestrator | skipping: [testbed-node-4] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'loop4', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2025-09-27 00:53:56.622685 | orchestrator | skipping: 
[testbed-node-5] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'dm-1', 'value': {'holders': [], 'host': '', 'links': {'ids': ['dm-name-ceph--26537eb5--d37a--51fe--a7ad--0ae3582304de-osd--block--26537eb5--d37a--51fe--a7ad--0ae3582304de', 'dm-uuid-LVM-9u1Et2K86e5ar3MRSDs86Q4n2GlHJ9LdwOYDbas8h12ne7B2BpRv09Athluk4exq'], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': '', 'sectors': 41934848, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2025-09-27 00:53:56.622702 | orchestrator | skipping: [testbed-node-5] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'loop0', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2025-09-27 00:53:56.622710 | orchestrator | skipping: [testbed-node-5] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'loop1', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2025-09-27 00:53:56.622719 | orchestrator | skipping: [testbed-node-4] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'loop5', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2025-09-27 00:53:56.622732 | orchestrator | skipping: [testbed-node-5] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'loop2', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2025-09-27 00:53:56.622740 | orchestrator | skipping: [testbed-node-5] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | 
default(False) | bool', 'item': {'key': 'loop3', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2025-09-27 00:53:56.622748 | orchestrator | skipping: [testbed-node-5] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'loop4', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2025-09-27 00:53:56.622760 | orchestrator | skipping: [testbed-node-4] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'loop6', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2025-09-27 00:53:56.622774 | orchestrator | skipping: [testbed-node-5] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'loop5', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2025-09-27 00:53:56.622783 | orchestrator | skipping: [testbed-node-5] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'loop6', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2025-09-27 00:53:56.622791 | orchestrator | skipping: [testbed-node-5] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'loop7', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 
Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2025-09-27 00:53:56.622806 | orchestrator | skipping: [testbed-node-4] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'loop7', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2025-09-27 00:53:56.622819 | orchestrator | skipping: [testbed-node-5] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'sda', 'value': {'holders': [], 'host': 'SCSI storage controller: Red Hat, Inc. Virtio SCSI', 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_140074d2-f452-403c-a39b-9b7f03d301d0', 'scsi-SQEMU_QEMU_HARDDISK_140074d2-f452-403c-a39b-9b7f03d301d0'], 'labels': [], 'masters': [], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {'sda1': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_140074d2-f452-403c-a39b-9b7f03d301d0-part1', 'scsi-SQEMU_QEMU_HARDDISK_140074d2-f452-403c-a39b-9b7f03d301d0-part1'], 'labels': ['cloudimg-rootfs'], 'masters': [], 'uuids': ['b852d8d2-8460-44aa-8998-23e4f04d73cf']}, 'sectors': 165672927, 'sectorsize': 512, 'size': '79.00 GB', 'start': '2099200', 'uuid': 'b852d8d2-8460-44aa-8998-23e4f04d73cf'}, 'sda14': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_140074d2-f452-403c-a39b-9b7f03d301d0-part14', 'scsi-SQEMU_QEMU_HARDDISK_140074d2-f452-403c-a39b-9b7f03d301d0-part14'], 'labels': [], 'masters': [], 'uuids': []}, 'sectors': 8192, 'sectorsize': 512, 'size': '4.00 MB', 'start': '2048', 'uuid': None}, 'sda15': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_140074d2-f452-403c-a39b-9b7f03d301d0-part15', 'scsi-SQEMU_QEMU_HARDDISK_140074d2-f452-403c-a39b-9b7f03d301d0-part15'], 'labels': ['UEFI'], 'masters': [], 'uuids': ['5C78-612A']}, 'sectors': 217088, 'sectorsize': 512, 'size': '106.00 MB', 'start': '10240', 'uuid': '5C78-612A'}, 'sda16': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_140074d2-f452-403c-a39b-9b7f03d301d0-part16', 'scsi-SQEMU_QEMU_HARDDISK_140074d2-f452-403c-a39b-9b7f03d301d0-part16'], 'labels': ['BOOT'], 'masters': [], 'uuids': ['09d53dc1-1e03-4286-bbb8-2b1796cf92ec']}, 'sectors': 1869825, 'sectorsize': 512, 'size': '913.00 MB', 'start': '227328', 'uuid': '09d53dc1-1e03-4286-bbb8-2b1796cf92ec'}}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 167772160, 'sectorsize': '512', 'size': '80.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}}, 'ansible_loop_var': 'item'})  2025-09-27 00:53:56.622839 | orchestrator | skipping: [testbed-node-4] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'sda', 'value': {'holders': [], 'host': 'SCSI storage controller: Red Hat, Inc. 
Virtio SCSI', 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_ccfdba47-3be1-47ca-9d9b-c14dc21688e2', 'scsi-SQEMU_QEMU_HARDDISK_ccfdba47-3be1-47ca-9d9b-c14dc21688e2'], 'labels': [], 'masters': [], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {'sda1': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_ccfdba47-3be1-47ca-9d9b-c14dc21688e2-part1', 'scsi-SQEMU_QEMU_HARDDISK_ccfdba47-3be1-47ca-9d9b-c14dc21688e2-part1'], 'labels': ['cloudimg-rootfs'], 'masters': [], 'uuids': ['b852d8d2-8460-44aa-8998-23e4f04d73cf']}, 'sectors': 165672927, 'sectorsize': 512, 'size': '79.00 GB', 'start': '2099200', 'uuid': 'b852d8d2-8460-44aa-8998-23e4f04d73cf'}, 'sda14': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_ccfdba47-3be1-47ca-9d9b-c14dc21688e2-part14', 'scsi-SQEMU_QEMU_HARDDISK_ccfdba47-3be1-47ca-9d9b-c14dc21688e2-part14'], 'labels': [], 'masters': [], 'uuids': []}, 'sectors': 8192, 'sectorsize': 512, 'size': '4.00 MB', 'start': '2048', 'uuid': None}, 'sda15': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_ccfdba47-3be1-47ca-9d9b-c14dc21688e2-part15', 'scsi-SQEMU_QEMU_HARDDISK_ccfdba47-3be1-47ca-9d9b-c14dc21688e2-part15'], 'labels': ['UEFI'], 'masters': [], 'uuids': ['5C78-612A']}, 'sectors': 217088, 'sectorsize': 512, 'size': '106.00 MB', 'start': '10240', 'uuid': '5C78-612A'}, 'sda16': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_ccfdba47-3be1-47ca-9d9b-c14dc21688e2-part16', 'scsi-SQEMU_QEMU_HARDDISK_ccfdba47-3be1-47ca-9d9b-c14dc21688e2-part16'], 'labels': ['BOOT'], 'masters': [], 'uuids': ['09d53dc1-1e03-4286-bbb8-2b1796cf92ec']}, 'sectors': 1869825, 'sectorsize': 512, 'size': '913.00 MB', 'start': '227328', 'uuid': '09d53dc1-1e03-4286-bbb8-2b1796cf92ec'}}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 167772160, 'sectorsize': '512', 'size': '80.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}}, 'ansible_loop_var': 'item'})  2025-09-27 00:53:56.622849 | orchestrator | skipping: [testbed-node-5] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'sdb', 'value': {'holders': ['ceph--03e94b17--8e91--5aba--9ae0--0b9f0a63cf06-osd--block--03e94b17--8e91--5aba--9ae0--0b9f0a63cf06'], 'host': 'SCSI storage controller: Red Hat, Inc. Virtio SCSI', 'links': {'ids': ['lvm-pv-uuid-3Sibgw-UrUF-sguV-cbL6-UeGk-LmaD-n7sO1o', 'scsi-0QEMU_QEMU_HARDDISK_44ee43e4-0ad4-479b-91ef-60ee60e7859d', 'scsi-SQEMU_QEMU_HARDDISK_44ee43e4-0ad4-479b-91ef-60ee60e7859d'], 'labels': [], 'masters': ['dm-0'], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 41943040, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}}, 'ansible_loop_var': 'item'})  2025-09-27 00:53:56.622866 | orchestrator | skipping: [testbed-node-4] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'sdb', 'value': {'holders': ['ceph--e62f59a6--4044--5e93--b85c--9f8cca280e9f-osd--block--e62f59a6--4044--5e93--b85c--9f8cca280e9f'], 'host': 'SCSI storage controller: Red Hat, Inc. 
Virtio SCSI', 'links': {'ids': ['lvm-pv-uuid-Me9aue-lek3-JPg0-VYec-326H-ZuKM-XDWaPz', 'scsi-0QEMU_QEMU_HARDDISK_e258aa1c-ff59-4b5b-956f-d2cfc00f460b', 'scsi-SQEMU_QEMU_HARDDISK_e258aa1c-ff59-4b5b-956f-d2cfc00f460b'], 'labels': [], 'masters': ['dm-0'], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 41943040, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}}, 'ansible_loop_var': 'item'})  2025-09-27 00:53:56.622875 | orchestrator | skipping: [testbed-node-4] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'sdc', 'value': {'holders': ['ceph--634a63d2--bd22--5328--9676--28392545ed43-osd--block--634a63d2--bd22--5328--9676--28392545ed43'], 'host': 'SCSI storage controller: Red Hat, Inc. Virtio SCSI', 'links': {'ids': ['lvm-pv-uuid-MLfaDW-hcSX-UUuz-T6hf-jnwD-2Ymd-7lmoLK', 'scsi-0QEMU_QEMU_HARDDISK_f6166654-1631-4845-81e5-73fa20742766', 'scsi-SQEMU_QEMU_HARDDISK_f6166654-1631-4845-81e5-73fa20742766'], 'labels': [], 'masters': ['dm-1'], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 41943040, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}}, 'ansible_loop_var': 'item'})  2025-09-27 00:53:56.622888 | orchestrator | skipping: [testbed-node-5] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'sdc', 'value': {'holders': ['ceph--26537eb5--d37a--51fe--a7ad--0ae3582304de-osd--block--26537eb5--d37a--51fe--a7ad--0ae3582304de'], 'host': 'SCSI storage controller: Red Hat, Inc. Virtio SCSI', 'links': {'ids': ['lvm-pv-uuid-RSp72m-6Gr3-OGR0-2DWW-9yZY-dCTz-l1m7o8', 'scsi-0QEMU_QEMU_HARDDISK_3491b7a4-1f4d-422d-b24b-7572a092bd2f', 'scsi-SQEMU_QEMU_HARDDISK_3491b7a4-1f4d-422d-b24b-7572a092bd2f'], 'labels': [], 'masters': ['dm-1'], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 41943040, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}}, 'ansible_loop_var': 'item'})  2025-09-27 00:53:56.622897 | orchestrator | skipping: [testbed-node-4] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'sdd', 'value': {'holders': [], 'host': 'SCSI storage controller: Red Hat, Inc. 
Virtio SCSI', 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_88b94aa1-4c02-44af-bedb-78cbed569408', 'scsi-SQEMU_QEMU_HARDDISK_88b94aa1-4c02-44af-bedb-78cbed569408'], 'labels': [], 'masters': [], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 41943040, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}}, 'ansible_loop_var': 'item'})  2025-09-27 00:53:56.622916 | orchestrator | skipping: [testbed-node-5] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'sdd', 'value': {'holders': [], 'host': 'SCSI storage controller: Red Hat, Inc. Virtio SCSI', 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_06352aa6-6cdc-4b09-96e0-787a93e7d706', 'scsi-SQEMU_QEMU_HARDDISK_06352aa6-6cdc-4b09-96e0-787a93e7d706'], 'labels': [], 'masters': [], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 41943040, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}}, 'ansible_loop_var': 'item'})  2025-09-27 00:53:56.622924 | orchestrator | skipping: [testbed-node-4] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'sr0', 'value': {'holders': [], 'host': 'IDE interface: Intel Corporation 82371SB PIIX3 IDE [Natoma/Triton II]', 'links': {'ids': ['ata-QEMU_DVD-ROM_QM00001'], 'labels': ['config-2'], 'masters': [], 'uuids': ['2025-09-27-00-02-11-00']}, 'model': 'QEMU DVD-ROM', 'partitions': {}, 'removable': '1', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'mq-deadline', 'sectors': 253, 'sectorsize': '2048', 'size': '506.00 KB', 'support_discard': '0', 'vendor': 'QEMU', 'virtual': 1}}, 'ansible_loop_var': 'item'})  2025-09-27 00:53:56.622933 | orchestrator | skipping: [testbed-node-5] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'sr0', 'value': {'holders': [], 'host': 'IDE interface: Intel Corporation 82371SB PIIX3 IDE [Natoma/Triton II]', 'links': {'ids': ['ata-QEMU_DVD-ROM_QM00001'], 'labels': ['config-2'], 'masters': [], 'uuids': ['2025-09-27-00-02-15-00']}, 'model': 'QEMU DVD-ROM', 'partitions': {}, 'removable': '1', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'mq-deadline', 'sectors': 253, 'sectorsize': '2048', 'size': '506.00 KB', 'support_discard': '0', 'vendor': 'QEMU', 'virtual': 1}}, 'ansible_loop_var': 'item'})  2025-09-27 00:53:56.622941 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:53:56.622949 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:53:56.622957 | orchestrator | 2025-09-27 00:53:56.622965 | orchestrator | TASK [ceph-facts : Check if the ceph conf exists] ****************************** 2025-09-27 00:53:56.622973 | orchestrator | Saturday 27 September 2025 00:44:05 +0000 (0:00:01.845) 0:00:33.554 **** 2025-09-27 00:53:56.622981 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:53:56.622989 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:53:56.622997 | orchestrator | ok: [testbed-node-2] 
2025-09-27 00:53:56.623009 | orchestrator | ok: [testbed-node-3]
2025-09-27 00:53:56.623017 | orchestrator | ok: [testbed-node-4]
2025-09-27 00:53:56.623025 | orchestrator | ok: [testbed-node-5]
2025-09-27 00:53:56.623032 | orchestrator |
2025-09-27 00:53:56.623040 | orchestrator | TASK [ceph-facts : Set default osd_pool_default_crush_rule fact] ***************
2025-09-27 00:53:56.623048 | orchestrator | Saturday 27 September 2025 00:44:07 +0000 (0:00:01.454) 0:00:35.008 ****
2025-09-27 00:53:56.623056 | orchestrator | ok: [testbed-node-0]
2025-09-27 00:53:56.623064 | orchestrator | ok: [testbed-node-1]
2025-09-27 00:53:56.623071 | orchestrator | ok: [testbed-node-2]
2025-09-27 00:53:56.623079 | orchestrator | ok: [testbed-node-3]
2025-09-27 00:53:56.623100 | orchestrator | ok: [testbed-node-4]
2025-09-27 00:53:56.623109 | orchestrator | ok: [testbed-node-5]
2025-09-27 00:53:56.623116 | orchestrator |
2025-09-27 00:53:56.623124 | orchestrator | TASK [ceph-facts : Read osd pool default crush rule] ***************************
2025-09-27 00:53:56.623132 | orchestrator | Saturday 27 September 2025 00:44:09 +0000 (0:00:01.766) 0:00:36.775 ****
2025-09-27 00:53:56.623145 | orchestrator | skipping: [testbed-node-0]
2025-09-27 00:53:56.623153 | orchestrator | skipping: [testbed-node-1]
2025-09-27 00:53:56.623160 | orchestrator | skipping: [testbed-node-2]
2025-09-27 00:53:56.623168 | orchestrator | skipping: [testbed-node-3]
2025-09-27 00:53:56.623175 | orchestrator | skipping: [testbed-node-4]
2025-09-27 00:53:56.623183 | orchestrator | skipping: [testbed-node-5]
2025-09-27 00:53:56.623191 | orchestrator |
2025-09-27 00:53:56.623199 | orchestrator | TASK [ceph-facts : Set osd_pool_default_crush_rule fact] ***********************
2025-09-27 00:53:56.623206 | orchestrator | Saturday 27 September 2025 00:44:10 +0000 (0:00:01.018) 0:00:37.794 ****
2025-09-27 00:53:56.623214 | orchestrator | skipping: [testbed-node-0]
2025-09-27 00:53:56.623222 | orchestrator | skipping: [testbed-node-1]
2025-09-27 00:53:56.623229 | orchestrator | skipping: [testbed-node-2]
2025-09-27 00:53:56.623237 | orchestrator | skipping: [testbed-node-3]
2025-09-27 00:53:56.623245 | orchestrator | skipping: [testbed-node-4]
2025-09-27 00:53:56.623252 | orchestrator | skipping: [testbed-node-5]
2025-09-27 00:53:56.623260 | orchestrator |
2025-09-27 00:53:56.623268 | orchestrator | TASK [ceph-facts : Read osd pool default crush rule] ***************************
2025-09-27 00:53:56.623275 | orchestrator | Saturday 27 September 2025 00:44:11 +0000 (0:00:01.187) 0:00:38.981 ****
2025-09-27 00:53:56.623283 | orchestrator | skipping: [testbed-node-0]
2025-09-27 00:53:56.623291 | orchestrator | skipping: [testbed-node-1]
2025-09-27 00:53:56.623298 | orchestrator | skipping: [testbed-node-2]
2025-09-27 00:53:56.623306 | orchestrator | skipping: [testbed-node-3]
2025-09-27 00:53:56.623314 | orchestrator | skipping: [testbed-node-4]
2025-09-27 00:53:56.623322 | orchestrator | skipping: [testbed-node-5]
2025-09-27 00:53:56.623329 | orchestrator |
2025-09-27 00:53:56.623337 | orchestrator | TASK [ceph-facts : Set osd_pool_default_crush_rule fact] ***********************
2025-09-27 00:53:56.623345 | orchestrator | Saturday 27 September 2025 00:44:12 +0000 (0:00:01.071) 0:00:40.052 ****
2025-09-27 00:53:56.623353 | orchestrator | skipping: [testbed-node-0]
2025-09-27 00:53:56.623360 | orchestrator | skipping: [testbed-node-1]
2025-09-27 00:53:56.623368 | orchestrator | skipping: [testbed-node-2]
2025-09-27 00:53:56.623376 | orchestrator | skipping: [testbed-node-3]
2025-09-27 00:53:56.623383 | orchestrator | skipping: [testbed-node-4]
2025-09-27 00:53:56.623391 | orchestrator | skipping: [testbed-node-5]
2025-09-27 00:53:56.623399 | orchestrator |
2025-09-27 00:53:56.623410 | orchestrator | TASK [ceph-facts : Set_fact _monitor_addresses - ipv4] *************************
2025-09-27 00:53:56.623418 | orchestrator | Saturday 27 September 2025 00:44:13 +0000 (0:00:00.755) 0:00:40.808 ****
2025-09-27 00:53:56.623426 | orchestrator | ok: [testbed-node-0] => (item=testbed-node-0)
2025-09-27 00:53:56.623434 | orchestrator | ok: [testbed-node-0] => (item=testbed-node-1)
2025-09-27 00:53:56.623441 | orchestrator | ok: [testbed-node-1] => (item=testbed-node-0)
2025-09-27 00:53:56.623449 | orchestrator | ok: [testbed-node-2] => (item=testbed-node-0)
2025-09-27 00:53:56.623457 | orchestrator | ok: [testbed-node-0] => (item=testbed-node-2)
2025-09-27 00:53:56.623464 | orchestrator | ok: [testbed-node-1] => (item=testbed-node-1)
2025-09-27 00:53:56.623472 | orchestrator | ok: [testbed-node-3] => (item=testbed-node-0)
2025-09-27 00:53:56.623480 | orchestrator | ok: [testbed-node-2] => (item=testbed-node-1)
2025-09-27 00:53:56.623487 | orchestrator | ok: [testbed-node-4] => (item=testbed-node-0)
2025-09-27 00:53:56.623517 | orchestrator | ok: [testbed-node-2] => (item=testbed-node-2)
2025-09-27 00:53:56.623526 | orchestrator | ok: [testbed-node-5] => (item=testbed-node-0)
2025-09-27 00:53:56.623533 | orchestrator | ok: [testbed-node-5] => (item=testbed-node-1)
2025-09-27 00:53:56.623541 | orchestrator | ok: [testbed-node-1] => (item=testbed-node-2)
2025-09-27 00:53:56.623549 | orchestrator | ok: [testbed-node-5] => (item=testbed-node-2)
2025-09-27 00:53:56.623557 | orchestrator | ok: [testbed-node-4] => (item=testbed-node-1)
2025-09-27 00:53:56.623564 | orchestrator | ok: [testbed-node-3] => (item=testbed-node-1)
2025-09-27 00:53:56.623572 | orchestrator | ok: [testbed-node-4] => (item=testbed-node-2)
2025-09-27 00:53:56.623585 | orchestrator | ok: [testbed-node-3] => (item=testbed-node-2)
2025-09-27 00:53:56.623592 | orchestrator |
2025-09-27 00:53:56.623600 | orchestrator | TASK [ceph-facts : Set_fact _monitor_addresses - ipv6] *************************
2025-09-27 00:53:56.623608 | orchestrator | Saturday 27 September 2025 00:44:17 +0000 (0:00:04.189) 0:00:44.998 ****
2025-09-27 00:53:56.623616 | orchestrator | skipping: [testbed-node-0] => (item=testbed-node-0)
2025-09-27 00:53:56.623624 | orchestrator | skipping: [testbed-node-0] => (item=testbed-node-1)
2025-09-27 00:53:56.623631 | orchestrator | skipping: [testbed-node-0] => (item=testbed-node-2)
2025-09-27 00:53:56.623639 | orchestrator | skipping: [testbed-node-0]
2025-09-27 00:53:56.623646 | orchestrator | skipping: [testbed-node-1] => (item=testbed-node-0)
2025-09-27 00:53:56.623654 | orchestrator | skipping: [testbed-node-1] => (item=testbed-node-1)
2025-09-27 00:53:56.623662 | orchestrator | skipping: [testbed-node-1] => (item=testbed-node-2)
2025-09-27 00:53:56.623669 | orchestrator | skipping: [testbed-node-1]
2025-09-27 00:53:56.623677 | orchestrator | skipping: [testbed-node-2] => (item=testbed-node-0)
2025-09-27 00:53:56.623684 | orchestrator | skipping: [testbed-node-2] => (item=testbed-node-1)
2025-09-27 00:53:56.623692 | orchestrator | skipping: [testbed-node-2] => (item=testbed-node-2)
2025-09-27 00:53:56.623700 | orchestrator | skipping: [testbed-node-2]
2025-09-27 00:53:56.623712 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-0)
2025-09-27 00:53:56.623720 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-1)
2025-09-27 00:53:56.623728 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-2)
2025-09-27 00:53:56.623736 | orchestrator | skipping: [testbed-node-3]
2025-09-27 00:53:56.623744 | orchestrator | skipping: [testbed-node-4] => (item=testbed-node-0)
2025-09-27 00:53:56.623751 | orchestrator | skipping: [testbed-node-4] => (item=testbed-node-1)
2025-09-27 00:53:56.623759 | orchestrator | skipping: [testbed-node-4] => (item=testbed-node-2)
2025-09-27 00:53:56.623767 | orchestrator | skipping: [testbed-node-4]
2025-09-27 00:53:56.623775 | orchestrator | skipping: [testbed-node-5] => (item=testbed-node-0)
2025-09-27 00:53:56.623782 | orchestrator | skipping: [testbed-node-5] => (item=testbed-node-1)
2025-09-27 00:53:56.623790 | orchestrator | skipping: [testbed-node-5] => (item=testbed-node-2)
2025-09-27 00:53:56.623798 | orchestrator | skipping: [testbed-node-5]
2025-09-27 00:53:56.623805 | orchestrator |
2025-09-27 00:53:56.623813 | orchestrator | TASK [ceph-facts : Import_tasks set_radosgw_address.yml] ***********************
2025-09-27 00:53:56.623821 | orchestrator | Saturday 27 September 2025 00:44:18 +0000 (0:00:01.060) 0:00:46.059 ****
2025-09-27 00:53:56.623829 | orchestrator | skipping: [testbed-node-0]
2025-09-27 00:53:56.623836 | orchestrator | skipping: [testbed-node-1]
2025-09-27 00:53:56.623844 | orchestrator | skipping: [testbed-node-2]
2025-09-27 00:53:56.623852 | orchestrator | included: /ansible/roles/ceph-facts/tasks/set_radosgw_address.yml for testbed-node-3, testbed-node-4, testbed-node-5
2025-09-27 00:53:56.623860 | orchestrator |
2025-09-27 00:53:56.623868 | orchestrator | TASK [ceph-facts : Set current radosgw_address_block, radosgw_address, radosgw_interface from node "{{ ceph_dashboard_call_item }}"] ***
2025-09-27 00:53:56.623876 | orchestrator | Saturday 27 September 2025 00:44:19 +0000 (0:00:01.350) 0:00:47.410 ****
2025-09-27 00:53:56.623884 | orchestrator | skipping: [testbed-node-3]
2025-09-27 00:53:56.623892 | orchestrator | skipping: [testbed-node-4]
2025-09-27 00:53:56.623900 | orchestrator | skipping: [testbed-node-5]
2025-09-27 00:53:56.623907 | orchestrator |
2025-09-27 00:53:56.623915 | orchestrator | TASK [ceph-facts : Set_fact _radosgw_address to radosgw_address_block ipv4] ****
2025-09-27 00:53:56.623923 | orchestrator | Saturday 27 September 2025 00:44:20 +0000 (0:00:00.789) 0:00:48.200 ****
2025-09-27 00:53:56.623931 | orchestrator | skipping: [testbed-node-3]
2025-09-27 00:53:56.623939 | orchestrator | skipping: [testbed-node-4]
2025-09-27 00:53:56.623947 | orchestrator | skipping: [testbed-node-5]
2025-09-27 00:53:56.623954 | orchestrator |
2025-09-27 00:53:56.623962 | orchestrator | TASK [ceph-facts : Set_fact _radosgw_address to radosgw_address_block ipv6] ****
2025-09-27 00:53:56.623975 | orchestrator | Saturday 27 September 2025 00:44:21 +0000 (0:00:00.499) 0:00:48.699 ****
2025-09-27 00:53:56.623983 | orchestrator | skipping: [testbed-node-3]
2025-09-27 00:53:56.623990 | orchestrator | skipping: [testbed-node-4]
2025-09-27 00:53:56.623998 | orchestrator | skipping: [testbed-node-5]
2025-09-27 00:53:56.624006 | orchestrator |
2025-09-27 00:53:56.624018 | orchestrator | TASK [ceph-facts : Set_fact _radosgw_address to radosgw_address] ***************
2025-09-27 00:53:56.624026 | orchestrator | Saturday 27 September 2025 00:44:21 +0000 (0:00:00.469) 0:00:49.168 ****
2025-09-27 00:53:56.624034 | orchestrator | ok: [testbed-node-3]
2025-09-27 00:53:56.624042 | orchestrator | ok: [testbed-node-4]
2025-09-27 00:53:56.624050 | orchestrator | ok: [testbed-node-5]
2025-09-27 00:53:56.624058 | orchestrator |
2025-09-27 00:53:56.624065 | orchestrator | TASK [ceph-facts : Set_fact _interface] ****************************************
2025-09-27 00:53:56.624073 | orchestrator | Saturday 27 September 2025 00:44:22 +0000 (0:00:01.440) 0:00:50.608 ****
2025-09-27 00:53:56.624081 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-3)
2025-09-27 00:53:56.624104 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-4)
2025-09-27 00:53:56.624112 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-5)
2025-09-27 00:53:56.624120 | orchestrator | skipping: [testbed-node-3]
2025-09-27 00:53:56.624127 | orchestrator |
2025-09-27 00:53:56.624135 | orchestrator | TASK [ceph-facts : Set_fact _radosgw_address to radosgw_interface - ipv4] ******
2025-09-27 00:53:56.624143 | orchestrator | Saturday 27 September 2025 00:44:23 +0000 (0:00:00.574) 0:00:51.183 ****
2025-09-27 00:53:56.624150 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-3)
2025-09-27 00:53:56.624158 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-4)
2025-09-27 00:53:56.624166 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-5)
2025-09-27 00:53:56.624174 | orchestrator | skipping: [testbed-node-3]
2025-09-27 00:53:56.624181 | orchestrator |
2025-09-27 00:53:56.624189 | orchestrator | TASK [ceph-facts : Set_fact _radosgw_address to radosgw_interface - ipv6] ******
2025-09-27 00:53:56.624197 | orchestrator | Saturday 27 September 2025 00:44:23 +0000 (0:00:00.310) 0:00:51.493 ****
2025-09-27 00:53:56.624204 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-3)
2025-09-27 00:53:56.624212 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-4)
2025-09-27 00:53:56.624220 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-5)
2025-09-27 00:53:56.624227 | orchestrator | skipping: [testbed-node-3]
2025-09-27 00:53:56.624235 | orchestrator |
2025-09-27 00:53:56.624243 | orchestrator | TASK [ceph-facts : Reset rgw_instances (workaround)] ***************************
2025-09-27 00:53:56.624251 | orchestrator | Saturday 27 September 2025 00:44:24 +0000 (0:00:00.409) 0:00:51.903 ****
2025-09-27 00:53:56.624258 | orchestrator | ok: [testbed-node-3]
2025-09-27 00:53:56.624266 | orchestrator | ok: [testbed-node-4]
2025-09-27 00:53:56.624274 | orchestrator | ok: [testbed-node-5]
2025-09-27 00:53:56.624281 | orchestrator |
2025-09-27 00:53:56.624289 | orchestrator | TASK [ceph-facts : Set_fact rgw_instances] *************************************
2025-09-27 00:53:56.624297 | orchestrator | Saturday 27 September 2025 00:44:24 +0000 (0:00:00.335) 0:00:52.238 ****
2025-09-27 00:53:56.624305 | orchestrator | ok: [testbed-node-3] => (item=0)
2025-09-27 00:53:56.624313 | orchestrator | ok: [testbed-node-4] => (item=0)
2025-09-27 00:53:56.624320 | orchestrator | ok: [testbed-node-5] => (item=0)
2025-09-27 00:53:56.624328 | orchestrator |
2025-09-27 00:53:56.624336 | orchestrator | TASK [ceph-facts : Set_fact ceph_run_cmd] **************************************
2025-09-27 00:53:56.624343 | orchestrator | Saturday 27 September 2025 00:44:25 +0000 (0:00:00.910) 0:00:53.148 ****
2025-09-27 00:53:56.624356 | orchestrator | ok: [testbed-node-0] => (item=testbed-node-0)
2025-09-27 00:53:56.624364 | orchestrator | ok: [testbed-node-0 -> testbed-node-1(192.168.16.11)] => (item=testbed-node-1)
2025-09-27 00:53:56.624372 | orchestrator | ok: [testbed-node-0 -> testbed-node-2(192.168.16.12)] => (item=testbed-node-2)
2025-09-27 00:53:56.624380 | orchestrator | ok: [testbed-node-0 -> testbed-node-3(192.168.16.13)] => (item=testbed-node-3)
2025-09-27 00:53:56.624393 | orchestrator | ok: [testbed-node-0 -> testbed-node-4(192.168.16.14)] => (item=testbed-node-4)
2025-09-27 00:53:56.624401 | orchestrator | ok: [testbed-node-0 -> testbed-node-5(192.168.16.15)] => (item=testbed-node-5)
2025-09-27 00:53:56.624409 | orchestrator | ok: [testbed-node-0 -> testbed-manager(192.168.16.5)] => (item=testbed-manager)
2025-09-27 00:53:56.624417 | orchestrator |
2025-09-27 00:53:56.624424 | orchestrator | TASK [ceph-facts : Set_fact ceph_admin_command] ********************************
2025-09-27 00:53:56.624432 | orchestrator | Saturday 27 September 2025 00:44:27 +0000 (0:00:01.661) 0:00:54.810 ****
2025-09-27 00:53:56.624440 | orchestrator | ok: [testbed-node-0] => (item=testbed-node-0)
2025-09-27 00:53:56.624448 | orchestrator | ok: [testbed-node-0 -> testbed-node-1(192.168.16.11)] => (item=testbed-node-1)
2025-09-27 00:53:56.624455 | orchestrator | ok: [testbed-node-0 -> testbed-node-2(192.168.16.12)] => (item=testbed-node-2)
2025-09-27 00:53:56.624463 | orchestrator | ok: [testbed-node-0 -> testbed-node-3(192.168.16.13)] => (item=testbed-node-3)
2025-09-27 00:53:56.624471 | orchestrator | ok: [testbed-node-0 -> testbed-node-4(192.168.16.14)] => (item=testbed-node-4)
2025-09-27 00:53:56.624479 | orchestrator | ok: [testbed-node-0 -> testbed-node-5(192.168.16.15)] => (item=testbed-node-5)
2025-09-27 00:53:56.624487 | orchestrator | ok: [testbed-node-0 -> testbed-manager(192.168.16.5)] => (item=testbed-manager)
2025-09-27 00:53:56.624494 | orchestrator |
2025-09-27 00:53:56.624502 | orchestrator | TASK [ceph-handler : Include check_running_cluster.yml] ************************
2025-09-27 00:53:56.624510 | orchestrator | Saturday 27 September 2025 00:44:29 +0000 (0:00:02.201) 0:00:57.012 ****
2025-09-27 00:53:56.624518 | orchestrator | included: /ansible/roles/ceph-handler/tasks/check_running_cluster.yml for testbed-node-0, testbed-node-1, testbed-node-2, testbed-node-3, testbed-node-4, testbed-node-5
2025-09-27 00:53:56.624526 | orchestrator |
2025-09-27 00:53:56.624534 | orchestrator | TASK [ceph-handler : Include check_running_containers.yml] *********************
2025-09-27 00:53:56.624542 | orchestrator | Saturday 27 September 2025 00:44:30 +0000 (0:00:01.158) 0:00:58.171 ****
2025-09-27 00:53:56.624554 | orchestrator | included: /ansible/roles/ceph-handler/tasks/check_running_containers.yml for testbed-node-0, testbed-node-2, testbed-node-1, testbed-node-3, testbed-node-5, testbed-node-4
2025-09-27 00:53:56.624562 | orchestrator |
2025-09-27 00:53:56.624570 | orchestrator | TASK [ceph-handler : Check for a mon container] ********************************
2025-09-27 00:53:56.624577 | orchestrator | Saturday 27 September 2025 00:44:31 +0000 (0:00:01.379) 0:00:59.550 ****
2025-09-27 00:53:56.624585 | orchestrator | skipping: [testbed-node-3]
2025-09-27 00:53:56.624593 | orchestrator | skipping: [testbed-node-4]
2025-09-27 00:53:56.624601 | orchestrator | ok: [testbed-node-0]
2025-09-27 00:53:56.624609 | orchestrator | ok: [testbed-node-1]
2025-09-27 00:53:56.624617 | orchestrator | skipping: [testbed-node-5]
2025-09-27 00:53:56.624625 | orchestrator | ok: [testbed-node-2]
2025-09-27 00:53:56.624632 | orchestrator |
2025-09-27 00:53:56.624640 | orchestrator | TASK [ceph-handler
: Check for an osd container] ******************************* 2025-09-27 00:53:56.624648 | orchestrator | Saturday 27 September 2025 00:44:32 +0000 (0:00:00.954) 0:01:00.504 **** 2025-09-27 00:53:56.624656 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:53:56.624664 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:53:56.624671 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:53:56.624679 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:53:56.624687 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:53:56.624695 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:53:56.624703 | orchestrator | 2025-09-27 00:53:56.624711 | orchestrator | TASK [ceph-handler : Check for a mds container] ******************************** 2025-09-27 00:53:56.624718 | orchestrator | Saturday 27 September 2025 00:44:34 +0000 (0:00:01.232) 0:01:01.737 **** 2025-09-27 00:53:56.624726 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:53:56.624734 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:53:56.624742 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:53:56.624755 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:53:56.624763 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:53:56.624771 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:53:56.624779 | orchestrator | 2025-09-27 00:53:56.624787 | orchestrator | TASK [ceph-handler : Check for a rgw container] ******************************** 2025-09-27 00:53:56.624795 | orchestrator | Saturday 27 September 2025 00:44:35 +0000 (0:00:01.444) 0:01:03.181 **** 2025-09-27 00:53:56.624802 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:53:56.624810 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:53:56.624818 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:53:56.624826 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:53:56.624833 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:53:56.624841 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:53:56.624849 | orchestrator | 2025-09-27 00:53:56.624857 | orchestrator | TASK [ceph-handler : Check for a mgr container] ******************************** 2025-09-27 00:53:56.624865 | orchestrator | Saturday 27 September 2025 00:44:36 +0000 (0:00:01.175) 0:01:04.357 **** 2025-09-27 00:53:56.624872 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:53:56.624880 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:53:56.624888 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:53:56.624896 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:53:56.624904 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:53:56.624911 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:53:56.624919 | orchestrator | 2025-09-27 00:53:56.624927 | orchestrator | TASK [ceph-handler : Check for a rbd mirror container] ************************* 2025-09-27 00:53:56.624935 | orchestrator | Saturday 27 September 2025 00:44:37 +0000 (0:00:00.677) 0:01:05.035 **** 2025-09-27 00:53:56.624947 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:53:56.624955 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:53:56.624963 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:53:56.624970 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:53:56.624978 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:53:56.624986 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:53:56.624994 | orchestrator | 2025-09-27 00:53:56.625002 | orchestrator | TASK [ceph-handler : Check for a nfs container] ******************************** 
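
The ceph-handler probes in this stretch (mon on testbed-node-0..2, osd/mds/rgw on testbed-node-3..5, plus the nfs check that follows) only establish whether a matching container is already running on each host, so that later handlers know which daemons can be restarted. A minimal stand-alone probe of the same kind is sketched below; the host pattern, container name pattern, docker CLI and fact name are illustrative assumptions, not the exact ceph-ansible implementation.

---
# Hypothetical, simplified "check for a mon container" probe.
# Host pattern, container naming and fact name are assumptions.
- hosts: testbed-node-0,testbed-node-1,testbed-node-2
  gather_facts: false
  tasks:
    - name: Check for a running ceph-mon container
      ansible.builtin.command: 'docker ps -q --filter name=ceph-mon-{{ inventory_hostname }}'
      register: ceph_mon_container_stat
      changed_when: false
      failed_when: false

    - name: Record whether a mon container is present
      ansible.builtin.set_fact:
        handler_mon_container_found: "{{ ceph_mon_container_stat.stdout | length > 0 }}"

Keeping the probe read-only (changed_when and failed_when forced to false) matches what the log shows: every one of these checks reports only "ok" or "skipping", never "changed".
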
2025-09-27 00:53:56.625010 | orchestrator | Saturday 27 September 2025 00:44:38 +0000 (0:00:00.672) 0:01:05.708 **** 2025-09-27 00:53:56.625017 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:53:56.625025 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:53:56.625033 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:53:56.625041 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:53:56.625048 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:53:56.625056 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:53:56.625064 | orchestrator | 2025-09-27 00:53:56.625072 | orchestrator | TASK [ceph-handler : Check for a ceph-crash container] ************************* 2025-09-27 00:53:56.625080 | orchestrator | Saturday 27 September 2025 00:44:38 +0000 (0:00:00.530) 0:01:06.238 **** 2025-09-27 00:53:56.625122 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:53:56.625131 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:53:56.625139 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:53:56.625147 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:53:56.625155 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:53:56.625162 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:53:56.625170 | orchestrator | 2025-09-27 00:53:56.625178 | orchestrator | TASK [ceph-handler : Check for a ceph-exporter container] ********************** 2025-09-27 00:53:56.625186 | orchestrator | Saturday 27 September 2025 00:44:40 +0000 (0:00:01.848) 0:01:08.086 **** 2025-09-27 00:53:56.625194 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:53:56.625202 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:53:56.625210 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:53:56.625216 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:53:56.625223 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:53:56.625230 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:53:56.625236 | orchestrator | 2025-09-27 00:53:56.625243 | orchestrator | TASK [ceph-handler : Include check_socket_non_container.yml] ******************* 2025-09-27 00:53:56.625255 | orchestrator | Saturday 27 September 2025 00:44:41 +0000 (0:00:00.903) 0:01:08.990 **** 2025-09-27 00:53:56.625261 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:53:56.625268 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:53:56.625275 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:53:56.625281 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:53:56.625288 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:53:56.625294 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:53:56.625301 | orchestrator | 2025-09-27 00:53:56.625307 | orchestrator | TASK [ceph-handler : Set_fact handler_mon_status] ****************************** 2025-09-27 00:53:56.625314 | orchestrator | Saturday 27 September 2025 00:44:42 +0000 (0:00:00.686) 0:01:09.677 **** 2025-09-27 00:53:56.625320 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:53:56.625327 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:53:56.625334 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:53:56.625340 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:53:56.625350 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:53:56.625357 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:53:56.625364 | orchestrator | 2025-09-27 00:53:56.625370 | orchestrator | TASK [ceph-handler : Set_fact handler_osd_status] ****************************** 2025-09-27 00:53:56.625377 | orchestrator | Saturday 27 September 2025 00:44:42 
+0000 (0:00:00.646) 0:01:10.323 **** 2025-09-27 00:53:56.625383 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:53:56.625390 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:53:56.625397 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:53:56.625403 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:53:56.625410 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:53:56.625416 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:53:56.625423 | orchestrator | 2025-09-27 00:53:56.625429 | orchestrator | TASK [ceph-handler : Set_fact handler_mds_status] ****************************** 2025-09-27 00:53:56.625436 | orchestrator | Saturday 27 September 2025 00:44:43 +0000 (0:00:01.127) 0:01:11.451 **** 2025-09-27 00:53:56.625443 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:53:56.625449 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:53:56.625456 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:53:56.625462 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:53:56.625469 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:53:56.625475 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:53:56.625482 | orchestrator | 2025-09-27 00:53:56.625489 | orchestrator | TASK [ceph-handler : Set_fact handler_rgw_status] ****************************** 2025-09-27 00:53:56.625495 | orchestrator | Saturday 27 September 2025 00:44:44 +0000 (0:00:00.769) 0:01:12.220 **** 2025-09-27 00:53:56.625502 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:53:56.625508 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:53:56.625515 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:53:56.625521 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:53:56.625528 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:53:56.625535 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:53:56.625541 | orchestrator | 2025-09-27 00:53:56.625548 | orchestrator | TASK [ceph-handler : Set_fact handler_nfs_status] ****************************** 2025-09-27 00:53:56.625555 | orchestrator | Saturday 27 September 2025 00:44:45 +0000 (0:00:00.886) 0:01:13.107 **** 2025-09-27 00:53:56.625561 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:53:56.625568 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:53:56.625574 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:53:56.625581 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:53:56.625587 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:53:56.625594 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:53:56.625600 | orchestrator | 2025-09-27 00:53:56.625607 | orchestrator | TASK [ceph-handler : Set_fact handler_rbd_status] ****************************** 2025-09-27 00:53:56.625613 | orchestrator | Saturday 27 September 2025 00:44:46 +0000 (0:00:00.570) 0:01:13.678 **** 2025-09-27 00:53:56.625620 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:53:56.625631 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:53:56.625637 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:53:56.625644 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:53:56.625650 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:53:56.625657 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:53:56.625664 | orchestrator | 2025-09-27 00:53:56.625670 | orchestrator | TASK [ceph-handler : Set_fact handler_mgr_status] ****************************** 2025-09-27 00:53:56.625681 | orchestrator | Saturday 27 September 2025 00:44:46 +0000 (0:00:00.674) 0:01:14.353 **** 2025-09-27 
00:53:56.625688 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:53:56.625694 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:53:56.625701 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:53:56.625708 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:53:56.625714 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:53:56.625721 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:53:56.625727 | orchestrator | 2025-09-27 00:53:56.625734 | orchestrator | TASK [ceph-handler : Set_fact handler_crash_status] **************************** 2025-09-27 00:53:56.625741 | orchestrator | Saturday 27 September 2025 00:44:47 +0000 (0:00:00.465) 0:01:14.819 **** 2025-09-27 00:53:56.625748 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:53:56.625754 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:53:56.625761 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:53:56.625767 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:53:56.625774 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:53:56.625781 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:53:56.625787 | orchestrator | 2025-09-27 00:53:56.625794 | orchestrator | TASK [ceph-handler : Set_fact handler_exporter_status] ************************* 2025-09-27 00:53:56.625801 | orchestrator | Saturday 27 September 2025 00:44:47 +0000 (0:00:00.646) 0:01:15.465 **** 2025-09-27 00:53:56.625807 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:53:56.625814 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:53:56.625820 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:53:56.625827 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:53:56.625833 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:53:56.625840 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:53:56.625846 | orchestrator | 2025-09-27 00:53:56.625853 | orchestrator | TASK [ceph-container-common : Generate systemd ceph target file] *************** 2025-09-27 00:53:56.625860 | orchestrator | Saturday 27 September 2025 00:44:48 +0000 (0:00:00.947) 0:01:16.413 **** 2025-09-27 00:53:56.625866 | orchestrator | changed: [testbed-node-0] 2025-09-27 00:53:56.625873 | orchestrator | changed: [testbed-node-1] 2025-09-27 00:53:56.625880 | orchestrator | changed: [testbed-node-3] 2025-09-27 00:53:56.625886 | orchestrator | changed: [testbed-node-2] 2025-09-27 00:53:56.625893 | orchestrator | changed: [testbed-node-4] 2025-09-27 00:53:56.625899 | orchestrator | changed: [testbed-node-5] 2025-09-27 00:53:56.625906 | orchestrator | 2025-09-27 00:53:56.625913 | orchestrator | TASK [ceph-container-common : Enable ceph.target] ****************************** 2025-09-27 00:53:56.625919 | orchestrator | Saturday 27 September 2025 00:44:50 +0000 (0:00:01.354) 0:01:17.767 **** 2025-09-27 00:53:56.625926 | orchestrator | changed: [testbed-node-0] 2025-09-27 00:53:56.625933 | orchestrator | changed: [testbed-node-3] 2025-09-27 00:53:56.625939 | orchestrator | changed: [testbed-node-1] 2025-09-27 00:53:56.625946 | orchestrator | changed: [testbed-node-2] 2025-09-27 00:53:56.625952 | orchestrator | changed: [testbed-node-5] 2025-09-27 00:53:56.625959 | orchestrator | changed: [testbed-node-4] 2025-09-27 00:53:56.625966 | orchestrator | 2025-09-27 00:53:56.625972 | orchestrator | TASK [ceph-container-common : Include prerequisites.yml] *********************** 2025-09-27 00:53:56.625979 | orchestrator | Saturday 27 September 2025 00:44:52 +0000 (0:00:02.141) 0:01:19.908 **** 2025-09-27 00:53:56.625989 | orchestrator | included: 
/ansible/roles/ceph-container-common/tasks/prerequisites.yml for testbed-node-0, testbed-node-1, testbed-node-2, testbed-node-3, testbed-node-4, testbed-node-5 2025-09-27 00:53:56.625996 | orchestrator | 2025-09-27 00:53:56.626003 | orchestrator | TASK [ceph-container-common : Stop lvmetad] ************************************ 2025-09-27 00:53:56.626014 | orchestrator | Saturday 27 September 2025 00:44:53 +0000 (0:00:01.087) 0:01:20.995 **** 2025-09-27 00:53:56.626178 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:53:56.626186 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:53:56.626193 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:53:56.626199 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:53:56.626206 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:53:56.626213 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:53:56.626220 | orchestrator | 2025-09-27 00:53:56.626227 | orchestrator | TASK [ceph-container-common : Disable and mask lvmetad service] **************** 2025-09-27 00:53:56.626234 | orchestrator | Saturday 27 September 2025 00:44:53 +0000 (0:00:00.560) 0:01:21.556 **** 2025-09-27 00:53:56.626241 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:53:56.626248 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:53:56.626255 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:53:56.626261 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:53:56.626268 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:53:56.626275 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:53:56.626282 | orchestrator | 2025-09-27 00:53:56.626289 | orchestrator | TASK [ceph-container-common : Remove ceph udev rules] ************************** 2025-09-27 00:53:56.626296 | orchestrator | Saturday 27 September 2025 00:44:54 +0000 (0:00:00.647) 0:01:22.204 **** 2025-09-27 00:53:56.626302 | orchestrator | ok: [testbed-node-2] => (item=/usr/lib/udev/rules.d/95-ceph-osd.rules) 2025-09-27 00:53:56.626309 | orchestrator | ok: [testbed-node-0] => (item=/usr/lib/udev/rules.d/95-ceph-osd.rules) 2025-09-27 00:53:56.626316 | orchestrator | ok: [testbed-node-1] => (item=/usr/lib/udev/rules.d/95-ceph-osd.rules) 2025-09-27 00:53:56.626323 | orchestrator | ok: [testbed-node-3] => (item=/usr/lib/udev/rules.d/95-ceph-osd.rules) 2025-09-27 00:53:56.626330 | orchestrator | ok: [testbed-node-4] => (item=/usr/lib/udev/rules.d/95-ceph-osd.rules) 2025-09-27 00:53:56.626337 | orchestrator | ok: [testbed-node-2] => (item=/usr/lib/udev/rules.d/60-ceph-by-parttypeuuid.rules) 2025-09-27 00:53:56.626344 | orchestrator | ok: [testbed-node-5] => (item=/usr/lib/udev/rules.d/95-ceph-osd.rules) 2025-09-27 00:53:56.626350 | orchestrator | ok: [testbed-node-1] => (item=/usr/lib/udev/rules.d/60-ceph-by-parttypeuuid.rules) 2025-09-27 00:53:56.626357 | orchestrator | ok: [testbed-node-0] => (item=/usr/lib/udev/rules.d/60-ceph-by-parttypeuuid.rules) 2025-09-27 00:53:56.626364 | orchestrator | ok: [testbed-node-3] => (item=/usr/lib/udev/rules.d/60-ceph-by-parttypeuuid.rules) 2025-09-27 00:53:56.626371 | orchestrator | ok: [testbed-node-4] => (item=/usr/lib/udev/rules.d/60-ceph-by-parttypeuuid.rules) 2025-09-27 00:53:56.626378 | orchestrator | ok: [testbed-node-5] => (item=/usr/lib/udev/rules.d/60-ceph-by-parttypeuuid.rules) 2025-09-27 00:53:56.626385 | orchestrator | 2025-09-27 00:53:56.626401 | orchestrator | TASK [ceph-container-common : Ensure tmpfiles.d is present] ******************** 2025-09-27 00:53:56.626408 | orchestrator | Saturday 27 
September 2025 00:44:55 +0000 (0:00:01.280) 0:01:23.484 **** 2025-09-27 00:53:56.626415 | orchestrator | changed: [testbed-node-0] 2025-09-27 00:53:56.626422 | orchestrator | changed: [testbed-node-1] 2025-09-27 00:53:56.626428 | orchestrator | changed: [testbed-node-2] 2025-09-27 00:53:56.626435 | orchestrator | changed: [testbed-node-3] 2025-09-27 00:53:56.626442 | orchestrator | changed: [testbed-node-4] 2025-09-27 00:53:56.626449 | orchestrator | changed: [testbed-node-5] 2025-09-27 00:53:56.626456 | orchestrator | 2025-09-27 00:53:56.626463 | orchestrator | TASK [ceph-container-common : Restore certificates selinux context] ************ 2025-09-27 00:53:56.626469 | orchestrator | Saturday 27 September 2025 00:44:56 +0000 (0:00:01.030) 0:01:24.515 **** 2025-09-27 00:53:56.626476 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:53:56.626483 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:53:56.626490 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:53:56.626497 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:53:56.626504 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:53:56.626511 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:53:56.626526 | orchestrator | 2025-09-27 00:53:56.626534 | orchestrator | TASK [ceph-container-common : Install python3 on osd nodes] ******************** 2025-09-27 00:53:56.626540 | orchestrator | Saturday 27 September 2025 00:44:57 +0000 (0:00:00.605) 0:01:25.120 **** 2025-09-27 00:53:56.626547 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:53:56.626554 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:53:56.626561 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:53:56.626568 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:53:56.626575 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:53:56.626581 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:53:56.626588 | orchestrator | 2025-09-27 00:53:56.626595 | orchestrator | TASK [ceph-container-common : Include registry.yml] **************************** 2025-09-27 00:53:56.626602 | orchestrator | Saturday 27 September 2025 00:44:58 +0000 (0:00:00.613) 0:01:25.733 **** 2025-09-27 00:53:56.626608 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:53:56.626616 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:53:56.626622 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:53:56.626629 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:53:56.626636 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:53:56.626642 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:53:56.626649 | orchestrator | 2025-09-27 00:53:56.626656 | orchestrator | TASK [ceph-container-common : Include fetch_image.yml] ************************* 2025-09-27 00:53:56.626663 | orchestrator | Saturday 27 September 2025 00:44:58 +0000 (0:00:00.510) 0:01:26.244 **** 2025-09-27 00:53:56.626670 | orchestrator | included: /ansible/roles/ceph-container-common/tasks/fetch_image.yml for testbed-node-0, testbed-node-1, testbed-node-2, testbed-node-3, testbed-node-4, testbed-node-5 2025-09-27 00:53:56.626677 | orchestrator | 2025-09-27 00:53:56.626684 | orchestrator | TASK [ceph-container-common : Pulling Ceph container image] ******************** 2025-09-27 00:53:56.626698 | orchestrator | Saturday 27 September 2025 00:44:59 +0000 (0:00:01.011) 0:01:27.256 **** 2025-09-27 00:53:56.626705 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:53:56.626712 | orchestrator | ok: [testbed-node-1] 2025-09-27 
00:53:56.626719 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:53:56.626726 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:53:56.626732 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:53:56.626739 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:53:56.626746 | orchestrator | 2025-09-27 00:53:56.626753 | orchestrator | TASK [ceph-container-common : Pulling alertmanager/prometheus/grafana container images] *** 2025-09-27 00:53:56.626760 | orchestrator | Saturday 27 September 2025 00:45:41 +0000 (0:00:41.656) 0:02:08.913 **** 2025-09-27 00:53:56.626767 | orchestrator | skipping: [testbed-node-0] => (item=docker.io/prom/alertmanager:v0.16.2)  2025-09-27 00:53:56.626774 | orchestrator | skipping: [testbed-node-0] => (item=docker.io/prom/prometheus:v2.7.2)  2025-09-27 00:53:56.626782 | orchestrator | skipping: [testbed-node-0] => (item=docker.io/grafana/grafana:6.7.4)  2025-09-27 00:53:56.626790 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:53:56.626797 | orchestrator | skipping: [testbed-node-1] => (item=docker.io/prom/alertmanager:v0.16.2)  2025-09-27 00:53:56.626805 | orchestrator | skipping: [testbed-node-1] => (item=docker.io/prom/prometheus:v2.7.2)  2025-09-27 00:53:56.626813 | orchestrator | skipping: [testbed-node-1] => (item=docker.io/grafana/grafana:6.7.4)  2025-09-27 00:53:56.626821 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:53:56.626828 | orchestrator | skipping: [testbed-node-2] => (item=docker.io/prom/alertmanager:v0.16.2)  2025-09-27 00:53:56.626836 | orchestrator | skipping: [testbed-node-2] => (item=docker.io/prom/prometheus:v2.7.2)  2025-09-27 00:53:56.626844 | orchestrator | skipping: [testbed-node-2] => (item=docker.io/grafana/grafana:6.7.4)  2025-09-27 00:53:56.626852 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:53:56.626860 | orchestrator | skipping: [testbed-node-3] => (item=docker.io/prom/alertmanager:v0.16.2)  2025-09-27 00:53:56.626867 | orchestrator | skipping: [testbed-node-3] => (item=docker.io/prom/prometheus:v2.7.2)  2025-09-27 00:53:56.626881 | orchestrator | skipping: [testbed-node-3] => (item=docker.io/grafana/grafana:6.7.4)  2025-09-27 00:53:56.626888 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:53:56.626896 | orchestrator | skipping: [testbed-node-4] => (item=docker.io/prom/alertmanager:v0.16.2)  2025-09-27 00:53:56.626904 | orchestrator | skipping: [testbed-node-4] => (item=docker.io/prom/prometheus:v2.7.2)  2025-09-27 00:53:56.626912 | orchestrator | skipping: [testbed-node-4] => (item=docker.io/grafana/grafana:6.7.4)  2025-09-27 00:53:56.626920 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:53:56.626928 | orchestrator | skipping: [testbed-node-5] => (item=docker.io/prom/alertmanager:v0.16.2)  2025-09-27 00:53:56.626936 | orchestrator | skipping: [testbed-node-5] => (item=docker.io/prom/prometheus:v2.7.2)  2025-09-27 00:53:56.626944 | orchestrator | skipping: [testbed-node-5] => (item=docker.io/grafana/grafana:6.7.4)  2025-09-27 00:53:56.626956 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:53:56.626965 | orchestrator | 2025-09-27 00:53:56.626973 | orchestrator | TASK [ceph-container-common : Pulling node-exporter container image] *********** 2025-09-27 00:53:56.626981 | orchestrator | Saturday 27 September 2025 00:45:41 +0000 (0:00:00.696) 0:02:09.609 **** 2025-09-27 00:53:56.626989 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:53:56.626997 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:53:56.627004 | orchestrator | skipping: [testbed-node-2] 
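
The "Pulling Ceph container image" task above dominates this phase at roughly 42 seconds, while the alertmanager/prometheus/grafana and node-exporter pulls are skipped in this run. A rough stand-alone equivalent of that pull is sketched below; the registry, image name and tag variables are illustrative defaults, not the values used by this job.

---
# Rough equivalent of the Ceph image pull; registry/image/tag are assumptions.
- hosts: all
  gather_facts: false
  vars:
    ceph_registry: quay.io
    ceph_image: ceph/daemon
    ceph_tag: latest-reef
  tasks:
    - name: Pulling Ceph container image
      ansible.builtin.command: "docker pull {{ ceph_registry }}/{{ ceph_image }}:{{ ceph_tag }}"
      register: ceph_image_pull
      changed_when: "'Downloaded newer image' in ceph_image_pull.stdout"
      retries: 3
      delay: 10
      until: ceph_image_pull.rc == 0

Retrying the pull a few times is a cheap way to ride out transient registry hiccups, which is typically how a 40-second step like this turns into a failed periodic job.
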
2025-09-27 00:53:56.627012 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:53:56.627020 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:53:56.627028 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:53:56.627036 | orchestrator | 2025-09-27 00:53:56.627043 | orchestrator | TASK [ceph-container-common : Export local ceph dev image] ********************* 2025-09-27 00:53:56.627051 | orchestrator | Saturday 27 September 2025 00:45:42 +0000 (0:00:00.622) 0:02:10.232 **** 2025-09-27 00:53:56.627059 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:53:56.627067 | orchestrator | 2025-09-27 00:53:56.627075 | orchestrator | TASK [ceph-container-common : Copy ceph dev image file] ************************ 2025-09-27 00:53:56.627083 | orchestrator | Saturday 27 September 2025 00:45:42 +0000 (0:00:00.350) 0:02:10.582 **** 2025-09-27 00:53:56.627127 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:53:56.627135 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:53:56.627143 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:53:56.627150 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:53:56.627156 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:53:56.627163 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:53:56.627170 | orchestrator | 2025-09-27 00:53:56.627176 | orchestrator | TASK [ceph-container-common : Load ceph dev image] ***************************** 2025-09-27 00:53:56.627183 | orchestrator | Saturday 27 September 2025 00:45:43 +0000 (0:00:00.630) 0:02:11.213 **** 2025-09-27 00:53:56.627189 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:53:56.627196 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:53:56.627203 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:53:56.627209 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:53:56.627216 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:53:56.627222 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:53:56.627229 | orchestrator | 2025-09-27 00:53:56.627236 | orchestrator | TASK [ceph-container-common : Remove tmp ceph dev image file] ****************** 2025-09-27 00:53:56.627242 | orchestrator | Saturday 27 September 2025 00:45:44 +0000 (0:00:01.042) 0:02:12.255 **** 2025-09-27 00:53:56.627249 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:53:56.627256 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:53:56.627262 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:53:56.627269 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:53:56.627275 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:53:56.627282 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:53:56.627288 | orchestrator | 2025-09-27 00:53:56.627295 | orchestrator | TASK [ceph-container-common : Get ceph version] ******************************** 2025-09-27 00:53:56.627302 | orchestrator | Saturday 27 September 2025 00:45:45 +0000 (0:00:00.854) 0:02:13.110 **** 2025-09-27 00:53:56.627313 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:53:56.627324 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:53:56.627331 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:53:56.627337 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:53:56.627344 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:53:56.627351 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:53:56.627357 | orchestrator | 2025-09-27 00:53:56.627364 | orchestrator | TASK [ceph-container-common : Set_fact ceph_version ceph_version.stdout.split] *** 
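
Once the image is available, ceph-container-common reads the Ceph version out of the container ("Get ceph version") and the following Set_fact and release tasks map it to a release name; further down, only the reef branch matches on all six nodes. A self-contained approximation of that logic is sketched below; the image reference and the version-to-release table are assumptions for illustration, not the role's actual variables.

---
# Approximate reconstruction of version/release detection; the image
# reference and the release table are illustrative assumptions.
- hosts: all
  gather_facts: false
  vars:
    ceph_image_ref: quay.io/ceph/daemon:latest-reef
    ceph_release_map:
      "17": quincy
      "18": reef
      "19": squid
  tasks:
    - name: Get ceph version
      ansible.builtin.command: "docker run --rm --entrypoint ceph {{ ceph_image_ref }} --version"
      register: ceph_version_cmd
      changed_when: false

    - name: Set_fact ceph_version from the command output (e.g. "18.2.x")
      ansible.builtin.set_fact:
        ceph_version: "{{ ceph_version_cmd.stdout.split(' ')[2] }}"

    - name: Set_fact ceph_release from the major version
      ansible.builtin.set_fact:
        ceph_release: "{{ ceph_release_map[ceph_version.split('.')[0]] }}"
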
2025-09-27 00:53:56.627371 | orchestrator | Saturday 27 September 2025 00:45:49 +0000 (0:00:03.805) 0:02:16.916 **** 2025-09-27 00:53:56.627377 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:53:56.627384 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:53:56.627391 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:53:56.627397 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:53:56.627404 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:53:56.627411 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:53:56.627417 | orchestrator | 2025-09-27 00:53:56.627424 | orchestrator | TASK [ceph-container-common : Include release.yml] ***************************** 2025-09-27 00:53:56.627431 | orchestrator | Saturday 27 September 2025 00:45:49 +0000 (0:00:00.516) 0:02:17.432 **** 2025-09-27 00:53:56.627438 | orchestrator | included: /ansible/roles/ceph-container-common/tasks/release.yml for testbed-node-0, testbed-node-1, testbed-node-2, testbed-node-3, testbed-node-4, testbed-node-5 2025-09-27 00:53:56.627446 | orchestrator | 2025-09-27 00:53:56.627452 | orchestrator | TASK [ceph-container-common : Set_fact ceph_release jewel] ********************* 2025-09-27 00:53:56.627459 | orchestrator | Saturday 27 September 2025 00:45:50 +0000 (0:00:01.067) 0:02:18.500 **** 2025-09-27 00:53:56.627466 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:53:56.627472 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:53:56.627479 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:53:56.627486 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:53:56.627492 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:53:56.627499 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:53:56.627506 | orchestrator | 2025-09-27 00:53:56.627512 | orchestrator | TASK [ceph-container-common : Set_fact ceph_release kraken] ******************** 2025-09-27 00:53:56.627519 | orchestrator | Saturday 27 September 2025 00:45:51 +0000 (0:00:00.506) 0:02:19.006 **** 2025-09-27 00:53:56.627526 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:53:56.627532 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:53:56.627539 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:53:56.627546 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:53:56.627552 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:53:56.627559 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:53:56.627566 | orchestrator | 2025-09-27 00:53:56.627572 | orchestrator | TASK [ceph-container-common : Set_fact ceph_release luminous] ****************** 2025-09-27 00:53:56.627579 | orchestrator | Saturday 27 September 2025 00:45:52 +0000 (0:00:00.688) 0:02:19.695 **** 2025-09-27 00:53:56.627586 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:53:56.627592 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:53:56.627599 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:53:56.627605 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:53:56.627612 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:53:56.627619 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:53:56.627625 | orchestrator | 2025-09-27 00:53:56.627632 | orchestrator | TASK [ceph-container-common : Set_fact ceph_release mimic] ********************* 2025-09-27 00:53:56.627643 | orchestrator | Saturday 27 September 2025 00:45:52 +0000 (0:00:00.577) 0:02:20.273 **** 2025-09-27 00:53:56.627650 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:53:56.627656 | orchestrator | skipping: 
[testbed-node-1] 2025-09-27 00:53:56.627662 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:53:56.627668 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:53:56.627674 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:53:56.627681 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:53:56.627692 | orchestrator | 2025-09-27 00:53:56.627698 | orchestrator | TASK [ceph-container-common : Set_fact ceph_release nautilus] ****************** 2025-09-27 00:53:56.627704 | orchestrator | Saturday 27 September 2025 00:45:53 +0000 (0:00:00.950) 0:02:21.223 **** 2025-09-27 00:53:56.627710 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:53:56.627717 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:53:56.627723 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:53:56.627729 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:53:56.627735 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:53:56.627741 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:53:56.627747 | orchestrator | 2025-09-27 00:53:56.627753 | orchestrator | TASK [ceph-container-common : Set_fact ceph_release octopus] ******************* 2025-09-27 00:53:56.627759 | orchestrator | Saturday 27 September 2025 00:45:54 +0000 (0:00:01.151) 0:02:22.375 **** 2025-09-27 00:53:56.627766 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:53:56.627772 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:53:56.627778 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:53:56.627784 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:53:56.627790 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:53:56.627796 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:53:56.627802 | orchestrator | 2025-09-27 00:53:56.627808 | orchestrator | TASK [ceph-container-common : Set_fact ceph_release pacific] ******************* 2025-09-27 00:53:56.627815 | orchestrator | Saturday 27 September 2025 00:45:55 +0000 (0:00:01.199) 0:02:23.574 **** 2025-09-27 00:53:56.627821 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:53:56.627827 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:53:56.627833 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:53:56.627839 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:53:56.627845 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:53:56.627851 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:53:56.627857 | orchestrator | 2025-09-27 00:53:56.627864 | orchestrator | TASK [ceph-container-common : Set_fact ceph_release quincy] ******************** 2025-09-27 00:53:56.627870 | orchestrator | Saturday 27 September 2025 00:45:56 +0000 (0:00:00.746) 0:02:24.321 **** 2025-09-27 00:53:56.627876 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:53:56.627882 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:53:56.627888 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:53:56.627894 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:53:56.627900 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:53:56.627907 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:53:56.627913 | orchestrator | 2025-09-27 00:53:56.627919 | orchestrator | TASK [ceph-container-common : Set_fact ceph_release reef] ********************** 2025-09-27 00:53:56.627928 | orchestrator | Saturday 27 September 2025 00:45:57 +0000 (0:00:00.839) 0:02:25.161 **** 2025-09-27 00:53:56.627935 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:53:56.627941 | orchestrator | ok: 
[testbed-node-1] 2025-09-27 00:53:56.627947 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:53:56.627954 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:53:56.627960 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:53:56.627966 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:53:56.627972 | orchestrator | 2025-09-27 00:53:56.627978 | orchestrator | TASK [ceph-config : Include create_ceph_initial_dirs.yml] ********************** 2025-09-27 00:53:56.627985 | orchestrator | Saturday 27 September 2025 00:45:58 +0000 (0:00:01.049) 0:02:26.210 **** 2025-09-27 00:53:56.627991 | orchestrator | included: /ansible/roles/ceph-config/tasks/create_ceph_initial_dirs.yml for testbed-node-0, testbed-node-1, testbed-node-2, testbed-node-3, testbed-node-4, testbed-node-5 2025-09-27 00:53:56.627997 | orchestrator | 2025-09-27 00:53:56.628003 | orchestrator | TASK [ceph-config : Create ceph initial directories] *************************** 2025-09-27 00:53:56.628010 | orchestrator | Saturday 27 September 2025 00:45:59 +0000 (0:00:00.988) 0:02:27.199 **** 2025-09-27 00:53:56.628016 | orchestrator | changed: [testbed-node-0] => (item=/etc/ceph) 2025-09-27 00:53:56.628022 | orchestrator | changed: [testbed-node-1] => (item=/etc/ceph) 2025-09-27 00:53:56.628032 | orchestrator | changed: [testbed-node-2] => (item=/etc/ceph) 2025-09-27 00:53:56.628038 | orchestrator | changed: [testbed-node-3] => (item=/etc/ceph) 2025-09-27 00:53:56.628045 | orchestrator | changed: [testbed-node-0] => (item=/var/lib/ceph/) 2025-09-27 00:53:56.628051 | orchestrator | changed: [testbed-node-4] => (item=/etc/ceph) 2025-09-27 00:53:56.628057 | orchestrator | changed: [testbed-node-1] => (item=/var/lib/ceph/) 2025-09-27 00:53:56.628063 | orchestrator | changed: [testbed-node-5] => (item=/etc/ceph) 2025-09-27 00:53:56.628069 | orchestrator | changed: [testbed-node-2] => (item=/var/lib/ceph/) 2025-09-27 00:53:56.628075 | orchestrator | changed: [testbed-node-3] => (item=/var/lib/ceph/) 2025-09-27 00:53:56.628082 | orchestrator | changed: [testbed-node-0] => (item=/var/lib/ceph/mon) 2025-09-27 00:53:56.628099 | orchestrator | changed: [testbed-node-1] => (item=/var/lib/ceph/mon) 2025-09-27 00:53:56.628105 | orchestrator | changed: [testbed-node-4] => (item=/var/lib/ceph/) 2025-09-27 00:53:56.628111 | orchestrator | changed: [testbed-node-5] => (item=/var/lib/ceph/) 2025-09-27 00:53:56.628117 | orchestrator | changed: [testbed-node-2] => (item=/var/lib/ceph/mon) 2025-09-27 00:53:56.628123 | orchestrator | changed: [testbed-node-3] => (item=/var/lib/ceph/mon) 2025-09-27 00:53:56.628129 | orchestrator | changed: [testbed-node-0] => (item=/var/lib/ceph/osd) 2025-09-27 00:53:56.628135 | orchestrator | changed: [testbed-node-1] => (item=/var/lib/ceph/osd) 2025-09-27 00:53:56.628142 | orchestrator | changed: [testbed-node-4] => (item=/var/lib/ceph/mon) 2025-09-27 00:53:56.628148 | orchestrator | changed: [testbed-node-5] => (item=/var/lib/ceph/mon) 2025-09-27 00:53:56.628154 | orchestrator | changed: [testbed-node-3] => (item=/var/lib/ceph/osd) 2025-09-27 00:53:56.628164 | orchestrator | changed: [testbed-node-2] => (item=/var/lib/ceph/osd) 2025-09-27 00:53:56.628170 | orchestrator | changed: [testbed-node-0] => (item=/var/lib/ceph/mds) 2025-09-27 00:53:56.628176 | orchestrator | changed: [testbed-node-1] => (item=/var/lib/ceph/mds) 2025-09-27 00:53:56.628183 | orchestrator | changed: [testbed-node-3] => (item=/var/lib/ceph/mds) 2025-09-27 00:53:56.628189 | orchestrator | changed: [testbed-node-2] => (item=/var/lib/ceph/mds) 
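
The loop running here lays down the standard ceph-ansible directory skeleton under /etc/ceph and /var/lib/ceph on every node before any daemon is started. Outside the role, the same layout can be reproduced with a single file-module loop like the sketch below; the owner/group/mode values are assumptions (ceph-ansible derives them from its own variables), while the path list mirrors the items shown in this task.

---
# Stand-alone sketch of the initial directory creation; ownership and mode
# are assumptions, the path list mirrors the items in the log above.
- hosts: all
  become: true
  gather_facts: false
  tasks:
    - name: Create ceph initial directories
      ansible.builtin.file:
        path: "{{ item }}"
        state: directory
        owner: "167"
        group: "167"
        mode: "0755"
      loop:
        - /etc/ceph
        - /var/lib/ceph
        - /var/lib/ceph/mon
        - /var/lib/ceph/osd
        - /var/lib/ceph/mds
        - /var/lib/ceph/tmp
        - /var/lib/ceph/crash
        - /var/lib/ceph/radosgw
        - /var/lib/ceph/bootstrap-rgw
        - /var/lib/ceph/bootstrap-mgr
        - /var/lib/ceph/bootstrap-mds
        - /var/lib/ceph/bootstrap-osd
        - /var/lib/ceph/bootstrap-rbd
        - /var/lib/ceph/bootstrap-rbd-mirror
        - /var/run/ceph
        - /var/log/ceph
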
2025-09-27 00:53:56.628195 | orchestrator | changed: [testbed-node-1] => (item=/var/lib/ceph/tmp) 2025-09-27 00:53:56.628201 | orchestrator | changed: [testbed-node-0] => (item=/var/lib/ceph/tmp) 2025-09-27 00:53:56.628207 | orchestrator | changed: [testbed-node-4] => (item=/var/lib/ceph/osd) 2025-09-27 00:53:56.628213 | orchestrator | changed: [testbed-node-5] => (item=/var/lib/ceph/osd) 2025-09-27 00:53:56.628219 | orchestrator | changed: [testbed-node-3] => (item=/var/lib/ceph/tmp) 2025-09-27 00:53:56.628225 | orchestrator | changed: [testbed-node-2] => (item=/var/lib/ceph/tmp) 2025-09-27 00:53:56.628232 | orchestrator | changed: [testbed-node-0] => (item=/var/lib/ceph/crash) 2025-09-27 00:53:56.628238 | orchestrator | changed: [testbed-node-1] => (item=/var/lib/ceph/crash) 2025-09-27 00:53:56.628244 | orchestrator | changed: [testbed-node-5] => (item=/var/lib/ceph/mds) 2025-09-27 00:53:56.628250 | orchestrator | changed: [testbed-node-4] => (item=/var/lib/ceph/mds) 2025-09-27 00:53:56.628256 | orchestrator | changed: [testbed-node-3] => (item=/var/lib/ceph/crash) 2025-09-27 00:53:56.628262 | orchestrator | changed: [testbed-node-0] => (item=/var/lib/ceph/radosgw) 2025-09-27 00:53:56.628269 | orchestrator | changed: [testbed-node-1] => (item=/var/lib/ceph/radosgw) 2025-09-27 00:53:56.628275 | orchestrator | changed: [testbed-node-2] => (item=/var/lib/ceph/crash) 2025-09-27 00:53:56.628281 | orchestrator | changed: [testbed-node-5] => (item=/var/lib/ceph/tmp) 2025-09-27 00:53:56.628287 | orchestrator | changed: [testbed-node-3] => (item=/var/lib/ceph/radosgw) 2025-09-27 00:53:56.628293 | orchestrator | changed: [testbed-node-4] => (item=/var/lib/ceph/tmp) 2025-09-27 00:53:56.628299 | orchestrator | changed: [testbed-node-0] => (item=/var/lib/ceph/bootstrap-rgw) 2025-09-27 00:53:56.628306 | orchestrator | changed: [testbed-node-1] => (item=/var/lib/ceph/bootstrap-rgw) 2025-09-27 00:53:56.628312 | orchestrator | changed: [testbed-node-2] => (item=/var/lib/ceph/radosgw) 2025-09-27 00:53:56.628322 | orchestrator | changed: [testbed-node-3] => (item=/var/lib/ceph/bootstrap-rgw) 2025-09-27 00:53:56.628328 | orchestrator | changed: [testbed-node-5] => (item=/var/lib/ceph/crash) 2025-09-27 00:53:56.628334 | orchestrator | changed: [testbed-node-0] => (item=/var/lib/ceph/bootstrap-mgr) 2025-09-27 00:53:56.628340 | orchestrator | changed: [testbed-node-4] => (item=/var/lib/ceph/crash) 2025-09-27 00:53:56.628347 | orchestrator | changed: [testbed-node-1] => (item=/var/lib/ceph/bootstrap-mgr) 2025-09-27 00:53:56.628356 | orchestrator | changed: [testbed-node-2] => (item=/var/lib/ceph/bootstrap-rgw) 2025-09-27 00:53:56.628362 | orchestrator | changed: [testbed-node-3] => (item=/var/lib/ceph/bootstrap-mgr) 2025-09-27 00:53:56.628368 | orchestrator | changed: [testbed-node-5] => (item=/var/lib/ceph/radosgw) 2025-09-27 00:53:56.628375 | orchestrator | changed: [testbed-node-1] => (item=/var/lib/ceph/bootstrap-mds) 2025-09-27 00:53:56.628381 | orchestrator | changed: [testbed-node-0] => (item=/var/lib/ceph/bootstrap-mds) 2025-09-27 00:53:56.628387 | orchestrator | changed: [testbed-node-4] => (item=/var/lib/ceph/radosgw) 2025-09-27 00:53:56.628393 | orchestrator | changed: [testbed-node-2] => (item=/var/lib/ceph/bootstrap-mgr) 2025-09-27 00:53:56.628399 | orchestrator | changed: [testbed-node-3] => (item=/var/lib/ceph/bootstrap-mds) 2025-09-27 00:53:56.628405 | orchestrator | changed: [testbed-node-0] => (item=/var/lib/ceph/bootstrap-osd) 2025-09-27 00:53:56.628411 | orchestrator | changed: 
[testbed-node-1] => (item=/var/lib/ceph/bootstrap-osd) 2025-09-27 00:53:56.628417 | orchestrator | changed: [testbed-node-5] => (item=/var/lib/ceph/bootstrap-rgw) 2025-09-27 00:53:56.628424 | orchestrator | changed: [testbed-node-4] => (item=/var/lib/ceph/bootstrap-rgw) 2025-09-27 00:53:56.628430 | orchestrator | changed: [testbed-node-2] => (item=/var/lib/ceph/bootstrap-mds) 2025-09-27 00:53:56.628436 | orchestrator | changed: [testbed-node-3] => (item=/var/lib/ceph/bootstrap-osd) 2025-09-27 00:53:56.628442 | orchestrator | changed: [testbed-node-1] => (item=/var/lib/ceph/bootstrap-rbd) 2025-09-27 00:53:56.628448 | orchestrator | changed: [testbed-node-0] => (item=/var/lib/ceph/bootstrap-rbd) 2025-09-27 00:53:56.628454 | orchestrator | changed: [testbed-node-5] => (item=/var/lib/ceph/bootstrap-mgr) 2025-09-27 00:53:56.628460 | orchestrator | changed: [testbed-node-4] => (item=/var/lib/ceph/bootstrap-mgr) 2025-09-27 00:53:56.628467 | orchestrator | changed: [testbed-node-2] => (item=/var/lib/ceph/bootstrap-osd) 2025-09-27 00:53:56.628473 | orchestrator | changed: [testbed-node-3] => (item=/var/lib/ceph/bootstrap-rbd) 2025-09-27 00:53:56.628479 | orchestrator | changed: [testbed-node-1] => (item=/var/lib/ceph/bootstrap-rbd-mirror) 2025-09-27 00:53:56.628485 | orchestrator | changed: [testbed-node-5] => (item=/var/lib/ceph/bootstrap-mds) 2025-09-27 00:53:56.628491 | orchestrator | changed: [testbed-node-0] => (item=/var/lib/ceph/bootstrap-rbd-mirror) 2025-09-27 00:53:56.628497 | orchestrator | changed: [testbed-node-4] => (item=/var/lib/ceph/bootstrap-mds) 2025-09-27 00:53:56.628504 | orchestrator | changed: [testbed-node-2] => (item=/var/lib/ceph/bootstrap-rbd) 2025-09-27 00:53:56.628510 | orchestrator | changed: [testbed-node-3] => (item=/var/lib/ceph/bootstrap-rbd-mirror) 2025-09-27 00:53:56.628516 | orchestrator | changed: [testbed-node-5] => (item=/var/lib/ceph/bootstrap-osd) 2025-09-27 00:53:56.628522 | orchestrator | changed: [testbed-node-1] => (item=/var/run/ceph) 2025-09-27 00:53:56.628531 | orchestrator | changed: [testbed-node-0] => (item=/var/run/ceph) 2025-09-27 00:53:56.628538 | orchestrator | changed: [testbed-node-4] => (item=/var/lib/ceph/bootstrap-osd) 2025-09-27 00:53:56.628544 | orchestrator | changed: [testbed-node-2] => (item=/var/lib/ceph/bootstrap-rbd-mirror) 2025-09-27 00:53:56.628550 | orchestrator | changed: [testbed-node-3] => (item=/var/run/ceph) 2025-09-27 00:53:56.628556 | orchestrator | changed: [testbed-node-5] => (item=/var/lib/ceph/bootstrap-rbd) 2025-09-27 00:53:56.628563 | orchestrator | changed: [testbed-node-1] => (item=/var/log/ceph) 2025-09-27 00:53:56.628569 | orchestrator | changed: [testbed-node-0] => (item=/var/log/ceph) 2025-09-27 00:53:56.628579 | orchestrator | changed: [testbed-node-3] => (item=/var/log/ceph) 2025-09-27 00:53:56.628585 | orchestrator | changed: [testbed-node-4] => (item=/var/lib/ceph/bootstrap-rbd) 2025-09-27 00:53:56.628591 | orchestrator | changed: [testbed-node-2] => (item=/var/run/ceph) 2025-09-27 00:53:56.628598 | orchestrator | changed: [testbed-node-5] => (item=/var/lib/ceph/bootstrap-rbd-mirror) 2025-09-27 00:53:56.628604 | orchestrator | changed: [testbed-node-4] => (item=/var/lib/ceph/bootstrap-rbd-mirror) 2025-09-27 00:53:56.628610 | orchestrator | changed: [testbed-node-2] => (item=/var/log/ceph) 2025-09-27 00:53:56.628616 | orchestrator | changed: [testbed-node-5] => (item=/var/run/ceph) 2025-09-27 00:53:56.628622 | orchestrator | changed: [testbed-node-4] => (item=/var/run/ceph) 2025-09-27 00:53:56.628628 | 
orchestrator | changed: [testbed-node-5] => (item=/var/log/ceph) 2025-09-27 00:53:56.628634 | orchestrator | changed: [testbed-node-4] => (item=/var/log/ceph) 2025-09-27 00:53:56.628640 | orchestrator | 2025-09-27 00:53:56.628647 | orchestrator | TASK [ceph-config : Include_tasks rgw_systemd_environment_file.yml] ************ 2025-09-27 00:53:56.628653 | orchestrator | Saturday 27 September 2025 00:46:06 +0000 (0:00:06.906) 0:02:34.105 **** 2025-09-27 00:53:56.628659 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:53:56.628665 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:53:56.628671 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:53:56.628678 | orchestrator | included: /ansible/roles/ceph-config/tasks/rgw_systemd_environment_file.yml for testbed-node-3, testbed-node-4, testbed-node-5 2025-09-27 00:53:56.628684 | orchestrator | 2025-09-27 00:53:56.628690 | orchestrator | TASK [ceph-config : Create rados gateway instance directories] ***************** 2025-09-27 00:53:56.628696 | orchestrator | Saturday 27 September 2025 00:46:07 +0000 (0:00:01.029) 0:02:35.135 **** 2025-09-27 00:53:56.628702 | orchestrator | changed: [testbed-node-3] => (item={'instance_name': 'rgw0', 'radosgw_address': '192.168.16.13', 'radosgw_frontend_port': 8081}) 2025-09-27 00:53:56.628709 | orchestrator | changed: [testbed-node-4] => (item={'instance_name': 'rgw0', 'radosgw_address': '192.168.16.14', 'radosgw_frontend_port': 8081}) 2025-09-27 00:53:56.628719 | orchestrator | changed: [testbed-node-5] => (item={'instance_name': 'rgw0', 'radosgw_address': '192.168.16.15', 'radosgw_frontend_port': 8081}) 2025-09-27 00:53:56.628725 | orchestrator | 2025-09-27 00:53:56.628731 | orchestrator | TASK [ceph-config : Generate environment file] ********************************* 2025-09-27 00:53:56.628737 | orchestrator | Saturday 27 September 2025 00:46:08 +0000 (0:00:00.663) 0:02:35.799 **** 2025-09-27 00:53:56.628744 | orchestrator | changed: [testbed-node-3] => (item={'instance_name': 'rgw0', 'radosgw_address': '192.168.16.13', 'radosgw_frontend_port': 8081}) 2025-09-27 00:53:56.628750 | orchestrator | changed: [testbed-node-4] => (item={'instance_name': 'rgw0', 'radosgw_address': '192.168.16.14', 'radosgw_frontend_port': 8081}) 2025-09-27 00:53:56.628756 | orchestrator | changed: [testbed-node-5] => (item={'instance_name': 'rgw0', 'radosgw_address': '192.168.16.15', 'radosgw_frontend_port': 8081}) 2025-09-27 00:53:56.628762 | orchestrator | 2025-09-27 00:53:56.628769 | orchestrator | TASK [ceph-config : Reset num_osds] ******************************************** 2025-09-27 00:53:56.628775 | orchestrator | Saturday 27 September 2025 00:46:09 +0000 (0:00:01.272) 0:02:37.071 **** 2025-09-27 00:53:56.628781 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:53:56.628787 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:53:56.628793 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:53:56.628799 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:53:56.628806 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:53:56.628812 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:53:56.628818 | orchestrator | 2025-09-27 00:53:56.628824 | orchestrator | TASK [ceph-config : Count number of osds for lvm scenario] ********************* 2025-09-27 00:53:56.628830 | orchestrator | Saturday 27 September 2025 00:46:10 +0000 (0:00:00.617) 0:02:37.689 **** 2025-09-27 00:53:56.628836 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:53:56.628843 | orchestrator | skipping: 
[testbed-node-1] 2025-09-27 00:53:56.628852 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:53:56.628858 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:53:56.628865 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:53:56.628871 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:53:56.628877 | orchestrator | 2025-09-27 00:53:56.628883 | orchestrator | TASK [ceph-config : Look up for ceph-volume rejected devices] ****************** 2025-09-27 00:53:56.628889 | orchestrator | Saturday 27 September 2025 00:46:10 +0000 (0:00:00.832) 0:02:38.521 **** 2025-09-27 00:53:56.628895 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:53:56.628901 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:53:56.628908 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:53:56.628914 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:53:56.628920 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:53:56.628926 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:53:56.628932 | orchestrator | 2025-09-27 00:53:56.628938 | orchestrator | TASK [ceph-config : Set_fact rejected_devices] ********************************* 2025-09-27 00:53:56.628945 | orchestrator | Saturday 27 September 2025 00:46:11 +0000 (0:00:00.576) 0:02:39.098 **** 2025-09-27 00:53:56.628951 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:53:56.628957 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:53:56.628966 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:53:56.628973 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:53:56.628979 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:53:56.628985 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:53:56.628991 | orchestrator | 2025-09-27 00:53:56.628997 | orchestrator | TASK [ceph-config : Set_fact _devices] ***************************************** 2025-09-27 00:53:56.629003 | orchestrator | Saturday 27 September 2025 00:46:12 +0000 (0:00:00.838) 0:02:39.937 **** 2025-09-27 00:53:56.629009 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:53:56.629016 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:53:56.629022 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:53:56.629028 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:53:56.629034 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:53:56.629040 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:53:56.629046 | orchestrator | 2025-09-27 00:53:56.629052 | orchestrator | TASK [ceph-config : Run 'ceph-volume lvm batch --report' to see how many osds are to be created] *** 2025-09-27 00:53:56.629059 | orchestrator | Saturday 27 September 2025 00:46:13 +0000 (0:00:00.738) 0:02:40.675 **** 2025-09-27 00:53:56.629065 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:53:56.629071 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:53:56.629077 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:53:56.629083 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:53:56.629100 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:53:56.629107 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:53:56.629113 | orchestrator | 2025-09-27 00:53:56.629119 | orchestrator | TASK [ceph-config : Set_fact num_osds from the output of 'ceph-volume lvm batch --report' (legacy report)] *** 2025-09-27 00:53:56.629125 | orchestrator | Saturday 27 September 2025 00:46:13 +0000 (0:00:00.599) 0:02:41.275 **** 2025-09-27 00:53:56.629131 | orchestrator | skipping: [testbed-node-0] 2025-09-27 
00:53:56.629137 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:53:56.629143 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:53:56.629150 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:53:56.629156 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:53:56.629162 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:53:56.629168 | orchestrator | 2025-09-27 00:53:56.629174 | orchestrator | TASK [ceph-config : Set_fact num_osds from the output of 'ceph-volume lvm batch --report' (new report)] *** 2025-09-27 00:53:56.629180 | orchestrator | Saturday 27 September 2025 00:46:14 +0000 (0:00:00.675) 0:02:41.950 **** 2025-09-27 00:53:56.629186 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:53:56.629193 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:53:56.629199 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:53:56.629209 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:53:56.629215 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:53:56.629221 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:53:56.629227 | orchestrator | 2025-09-27 00:53:56.629233 | orchestrator | TASK [ceph-config : Run 'ceph-volume lvm list' to see how many osds have already been created] *** 2025-09-27 00:53:56.629240 | orchestrator | Saturday 27 September 2025 00:46:14 +0000 (0:00:00.571) 0:02:42.521 **** 2025-09-27 00:53:56.629246 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:53:56.629255 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:53:56.629261 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:53:56.629268 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:53:56.629274 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:53:56.629280 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:53:56.629286 | orchestrator | 2025-09-27 00:53:56.629292 | orchestrator | TASK [ceph-config : Set_fact num_osds (add existing osds)] ********************* 2025-09-27 00:53:56.629298 | orchestrator | Saturday 27 September 2025 00:46:18 +0000 (0:00:03.315) 0:02:45.836 **** 2025-09-27 00:53:56.629305 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:53:56.629311 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:53:56.629317 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:53:56.629323 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:53:56.629329 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:53:56.629335 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:53:56.629341 | orchestrator | 2025-09-27 00:53:56.629347 | orchestrator | TASK [ceph-config : Set_fact _osd_memory_target] ******************************* 2025-09-27 00:53:56.629354 | orchestrator | Saturday 27 September 2025 00:46:18 +0000 (0:00:00.477) 0:02:46.314 **** 2025-09-27 00:53:56.629360 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:53:56.629366 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:53:56.629372 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:53:56.629378 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:53:56.629384 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:53:56.629390 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:53:56.629397 | orchestrator | 2025-09-27 00:53:56.629403 | orchestrator | TASK [ceph-config : Set osd_memory_target to cluster host config] ************** 2025-09-27 00:53:56.629409 | orchestrator | Saturday 27 September 2025 00:46:19 +0000 (0:00:00.696) 0:02:47.010 **** 2025-09-27 00:53:56.629415 | orchestrator | skipping: [testbed-node-0] 2025-09-27 
00:53:56.629421 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:53:56.629427 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:53:56.629433 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:53:56.629439 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:53:56.629446 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:53:56.629452 | orchestrator | 2025-09-27 00:53:56.629458 | orchestrator | TASK [ceph-config : Render rgw configs] **************************************** 2025-09-27 00:53:56.629464 | orchestrator | Saturday 27 September 2025 00:46:19 +0000 (0:00:00.576) 0:02:47.587 **** 2025-09-27 00:53:56.629470 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:53:56.629476 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:53:56.629483 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:53:56.629489 | orchestrator | ok: [testbed-node-3] => (item={'instance_name': 'rgw0', 'radosgw_address': '192.168.16.13', 'radosgw_frontend_port': 8081}) 2025-09-27 00:53:56.629495 | orchestrator | ok: [testbed-node-4] => (item={'instance_name': 'rgw0', 'radosgw_address': '192.168.16.14', 'radosgw_frontend_port': 8081}) 2025-09-27 00:53:56.629501 | orchestrator | ok: [testbed-node-5] => (item={'instance_name': 'rgw0', 'radosgw_address': '192.168.16.15', 'radosgw_frontend_port': 8081}) 2025-09-27 00:53:56.629507 | orchestrator | 2025-09-27 00:53:56.629514 | orchestrator | TASK [ceph-config : Set config to cluster] ************************************* 2025-09-27 00:53:56.629522 | orchestrator | Saturday 27 September 2025 00:46:20 +0000 (0:00:00.794) 0:02:48.381 **** 2025-09-27 00:53:56.629529 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:53:56.629535 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:53:56.629545 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:53:56.629552 | orchestrator | skipping: [testbed-node-3] => (item=[{'key': 'client.rgw.default.testbed-node-3.rgw0', 'value': {'log_file': '/var/log/ceph/ceph-rgw-default-testbed-node-3.rgw0.log', 'rgw_frontends': 'beast endpoint=192.168.16.13:8081'}}, {'key': 'log_file', 'value': '/var/log/ceph/ceph-rgw-default-testbed-node-3.rgw0.log'}])  2025-09-27 00:53:56.629561 | orchestrator | skipping: [testbed-node-3] => (item=[{'key': 'client.rgw.default.testbed-node-3.rgw0', 'value': {'log_file': '/var/log/ceph/ceph-rgw-default-testbed-node-3.rgw0.log', 'rgw_frontends': 'beast endpoint=192.168.16.13:8081'}}, {'key': 'rgw_frontends', 'value': 'beast endpoint=192.168.16.13:8081'}])  2025-09-27 00:53:56.629569 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:53:56.629575 | orchestrator | skipping: [testbed-node-4] => (item=[{'key': 'client.rgw.default.testbed-node-4.rgw0', 'value': {'log_file': '/var/log/ceph/ceph-rgw-default-testbed-node-4.rgw0.log', 'rgw_frontends': 'beast endpoint=192.168.16.14:8081'}}, {'key': 'log_file', 'value': '/var/log/ceph/ceph-rgw-default-testbed-node-4.rgw0.log'}])  2025-09-27 00:53:56.629582 | orchestrator | skipping: [testbed-node-4] => (item=[{'key': 'client.rgw.default.testbed-node-4.rgw0', 'value': {'log_file': '/var/log/ceph/ceph-rgw-default-testbed-node-4.rgw0.log', 'rgw_frontends': 'beast endpoint=192.168.16.14:8081'}}, {'key': 'rgw_frontends', 'value': 'beast endpoint=192.168.16.14:8081'}])  2025-09-27 00:53:56.629588 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:53:56.629594 | orchestrator | skipping: [testbed-node-5] => (item=[{'key': 'client.rgw.default.testbed-node-5.rgw0', 'value': {'log_file': 
'/var/log/ceph/ceph-rgw-default-testbed-node-5.rgw0.log', 'rgw_frontends': 'beast endpoint=192.168.16.15:8081'}}, {'key': 'log_file', 'value': '/var/log/ceph/ceph-rgw-default-testbed-node-5.rgw0.log'}])  2025-09-27 00:53:56.629606 | orchestrator | skipping: [testbed-node-5] => (item=[{'key': 'client.rgw.default.testbed-node-5.rgw0', 'value': {'log_file': '/var/log/ceph/ceph-rgw-default-testbed-node-5.rgw0.log', 'rgw_frontends': 'beast endpoint=192.168.16.15:8081'}}, {'key': 'rgw_frontends', 'value': 'beast endpoint=192.168.16.15:8081'}])  2025-09-27 00:53:56.629613 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:53:56.629619 | orchestrator | 2025-09-27 00:53:56.629626 | orchestrator | TASK [ceph-config : Set rgw configs to file] *********************************** 2025-09-27 00:53:56.629632 | orchestrator | Saturday 27 September 2025 00:46:21 +0000 (0:00:00.784) 0:02:49.165 **** 2025-09-27 00:53:56.629638 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:53:56.629644 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:53:56.629650 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:53:56.629656 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:53:56.629663 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:53:56.629669 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:53:56.629675 | orchestrator | 2025-09-27 00:53:56.629681 | orchestrator | TASK [ceph-config : Create ceph conf directory] ******************************** 2025-09-27 00:53:56.629687 | orchestrator | Saturday 27 September 2025 00:46:22 +0000 (0:00:00.932) 0:02:50.098 **** 2025-09-27 00:53:56.629693 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:53:56.629699 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:53:56.629706 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:53:56.629712 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:53:56.629718 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:53:56.629724 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:53:56.629730 | orchestrator | 2025-09-27 00:53:56.629736 | orchestrator | TASK [ceph-facts : Set current radosgw_address_block, radosgw_address, radosgw_interface from node "{{ ceph_dashboard_call_item }}"] *** 2025-09-27 00:53:56.629743 | orchestrator | Saturday 27 September 2025 00:46:23 +0000 (0:00:00.874) 0:02:50.973 **** 2025-09-27 00:53:56.629749 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:53:56.629755 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:53:56.629761 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:53:56.629771 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:53:56.629777 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:53:56.629783 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:53:56.629789 | orchestrator | 2025-09-27 00:53:56.629796 | orchestrator | TASK [ceph-facts : Set_fact _radosgw_address to radosgw_address_block ipv4] **** 2025-09-27 00:53:56.629802 | orchestrator | Saturday 27 September 2025 00:46:24 +0000 (0:00:00.797) 0:02:51.771 **** 2025-09-27 00:53:56.629808 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:53:56.629814 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:53:56.629820 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:53:56.629826 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:53:56.629832 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:53:56.629838 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:53:56.629845 | 
orchestrator | 2025-09-27 00:53:56.629851 | orchestrator | TASK [ceph-facts : Set_fact _radosgw_address to radosgw_address_block ipv6] **** 2025-09-27 00:53:56.629857 | orchestrator | Saturday 27 September 2025 00:46:24 +0000 (0:00:00.528) 0:02:52.299 **** 2025-09-27 00:53:56.629863 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:53:56.629869 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:53:56.629875 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:53:56.629884 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:53:56.629891 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:53:56.629897 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:53:56.629903 | orchestrator | 2025-09-27 00:53:56.629909 | orchestrator | TASK [ceph-facts : Set_fact _radosgw_address to radosgw_address] *************** 2025-09-27 00:53:56.629915 | orchestrator | Saturday 27 September 2025 00:46:25 +0000 (0:00:00.822) 0:02:53.122 **** 2025-09-27 00:53:56.629922 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:53:56.629928 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:53:56.629934 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:53:56.629940 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:53:56.629946 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:53:56.629952 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:53:56.629959 | orchestrator | 2025-09-27 00:53:56.629965 | orchestrator | TASK [ceph-facts : Set_fact _interface] **************************************** 2025-09-27 00:53:56.629971 | orchestrator | Saturday 27 September 2025 00:46:26 +0000 (0:00:00.740) 0:02:53.862 **** 2025-09-27 00:53:56.629977 | orchestrator | skipping: [testbed-node-0] => (item=testbed-node-3)  2025-09-27 00:53:56.629983 | orchestrator | skipping: [testbed-node-0] => (item=testbed-node-4)  2025-09-27 00:53:56.629990 | orchestrator | skipping: [testbed-node-0] => (item=testbed-node-5)  2025-09-27 00:53:56.629996 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:53:56.630002 | orchestrator | 2025-09-27 00:53:56.630008 | orchestrator | TASK [ceph-facts : Set_fact _radosgw_address to radosgw_interface - ipv4] ****** 2025-09-27 00:53:56.630040 | orchestrator | Saturday 27 September 2025 00:46:26 +0000 (0:00:00.551) 0:02:54.414 **** 2025-09-27 00:53:56.630048 | orchestrator | skipping: [testbed-node-0] => (item=testbed-node-3)  2025-09-27 00:53:56.630054 | orchestrator | skipping: [testbed-node-0] => (item=testbed-node-4)  2025-09-27 00:53:56.630060 | orchestrator | skipping: [testbed-node-0] => (item=testbed-node-5)  2025-09-27 00:53:56.630066 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:53:56.630073 | orchestrator | 2025-09-27 00:53:56.630079 | orchestrator | TASK [ceph-facts : Set_fact _radosgw_address to radosgw_interface - ipv6] ****** 2025-09-27 00:53:56.630111 | orchestrator | Saturday 27 September 2025 00:46:27 +0000 (0:00:00.613) 0:02:55.027 **** 2025-09-27 00:53:56.630119 | orchestrator | skipping: [testbed-node-0] => (item=testbed-node-3)  2025-09-27 00:53:56.630125 | orchestrator | skipping: [testbed-node-0] => (item=testbed-node-4)  2025-09-27 00:53:56.630131 | orchestrator | skipping: [testbed-node-0] => (item=testbed-node-5)  2025-09-27 00:53:56.630138 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:53:56.630144 | orchestrator | 2025-09-27 00:53:56.630150 | orchestrator | TASK [ceph-facts : Reset rgw_instances (workaround)] *************************** 2025-09-27 00:53:56.630162 | orchestrator | Saturday 27 
September 2025 00:46:27 +0000 (0:00:00.360) 0:02:55.388 **** 2025-09-27 00:53:56.630168 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:53:56.630174 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:53:56.630180 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:53:56.630187 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:53:56.630193 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:53:56.630199 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:53:56.630205 | orchestrator | 2025-09-27 00:53:56.630211 | orchestrator | TASK [ceph-facts : Set_fact rgw_instances] ************************************* 2025-09-27 00:53:56.630218 | orchestrator | Saturday 27 September 2025 00:46:28 +0000 (0:00:00.540) 0:02:55.928 **** 2025-09-27 00:53:56.630224 | orchestrator | skipping: [testbed-node-0] => (item=0)  2025-09-27 00:53:56.630230 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:53:56.630236 | orchestrator | skipping: [testbed-node-1] => (item=0)  2025-09-27 00:53:56.630242 | orchestrator | skipping: [testbed-node-2] => (item=0)  2025-09-27 00:53:56.630248 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:53:56.630254 | orchestrator | ok: [testbed-node-3] => (item=0) 2025-09-27 00:53:56.630261 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:53:56.630267 | orchestrator | ok: [testbed-node-4] => (item=0) 2025-09-27 00:53:56.630273 | orchestrator | ok: [testbed-node-5] => (item=0) 2025-09-27 00:53:56.630280 | orchestrator | 2025-09-27 00:53:56.630286 | orchestrator | TASK [ceph-config : Generate Ceph file] **************************************** 2025-09-27 00:53:56.630292 | orchestrator | Saturday 27 September 2025 00:46:30 +0000 (0:00:01.748) 0:02:57.677 **** 2025-09-27 00:53:56.630298 | orchestrator | changed: [testbed-node-0] 2025-09-27 00:53:56.630305 | orchestrator | changed: [testbed-node-2] 2025-09-27 00:53:56.630311 | orchestrator | changed: [testbed-node-1] 2025-09-27 00:53:56.630317 | orchestrator | changed: [testbed-node-3] 2025-09-27 00:53:56.630323 | orchestrator | changed: [testbed-node-4] 2025-09-27 00:53:56.630329 | orchestrator | changed: [testbed-node-5] 2025-09-27 00:53:56.630335 | orchestrator | 2025-09-27 00:53:56.630341 | orchestrator | RUNNING HANDLER [ceph-handler : Make tempdir for scripts] ********************** 2025-09-27 00:53:56.630347 | orchestrator | Saturday 27 September 2025 00:46:33 +0000 (0:00:03.178) 0:03:00.855 **** 2025-09-27 00:53:56.630354 | orchestrator | changed: [testbed-node-0] 2025-09-27 00:53:56.630360 | orchestrator | changed: [testbed-node-1] 2025-09-27 00:53:56.630366 | orchestrator | changed: [testbed-node-2] 2025-09-27 00:53:56.630372 | orchestrator | changed: [testbed-node-3] 2025-09-27 00:53:56.630378 | orchestrator | changed: [testbed-node-4] 2025-09-27 00:53:56.630384 | orchestrator | changed: [testbed-node-5] 2025-09-27 00:53:56.630390 | orchestrator | 2025-09-27 00:53:56.630396 | orchestrator | RUNNING HANDLER [ceph-handler : Mons handler] ********************************** 2025-09-27 00:53:56.630403 | orchestrator | Saturday 27 September 2025 00:46:34 +0000 (0:00:01.033) 0:03:01.889 **** 2025-09-27 00:53:56.630409 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:53:56.630415 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:53:56.630421 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:53:56.630427 | orchestrator | included: /ansible/roles/ceph-handler/tasks/handler_mons.yml for testbed-node-0, testbed-node-1, testbed-node-2 2025-09-27 00:53:56.630433 | 
orchestrator | 2025-09-27 00:53:56.630440 | orchestrator | RUNNING HANDLER [ceph-handler : Set _mon_handler_called before restart] ******** 2025-09-27 00:53:56.630446 | orchestrator | Saturday 27 September 2025 00:46:35 +0000 (0:00:01.108) 0:03:02.997 **** 2025-09-27 00:53:56.630452 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:53:56.630458 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:53:56.630464 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:53:56.630471 | orchestrator | 2025-09-27 00:53:56.630477 | orchestrator | RUNNING HANDLER [ceph-handler : Copy mon restart script] *********************** 2025-09-27 00:53:56.630492 | orchestrator | Saturday 27 September 2025 00:46:35 +0000 (0:00:00.382) 0:03:03.379 **** 2025-09-27 00:53:56.630499 | orchestrator | changed: [testbed-node-0] 2025-09-27 00:53:56.630505 | orchestrator | changed: [testbed-node-1] 2025-09-27 00:53:56.630516 | orchestrator | changed: [testbed-node-2] 2025-09-27 00:53:56.630522 | orchestrator | 2025-09-27 00:53:56.630528 | orchestrator | RUNNING HANDLER [ceph-handler : Restart ceph mon daemon(s)] ******************** 2025-09-27 00:53:56.630535 | orchestrator | Saturday 27 September 2025 00:46:37 +0000 (0:00:01.272) 0:03:04.652 **** 2025-09-27 00:53:56.630541 | orchestrator | skipping: [testbed-node-0] => (item=testbed-node-0)  2025-09-27 00:53:56.630547 | orchestrator | skipping: [testbed-node-0] => (item=testbed-node-1)  2025-09-27 00:53:56.630553 | orchestrator | skipping: [testbed-node-0] => (item=testbed-node-2)  2025-09-27 00:53:56.630559 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:53:56.630565 | orchestrator | 2025-09-27 00:53:56.630571 | orchestrator | RUNNING HANDLER [ceph-handler : Set _mon_handler_called after restart] ********* 2025-09-27 00:53:56.630578 | orchestrator | Saturday 27 September 2025 00:46:37 +0000 (0:00:00.878) 0:03:05.531 **** 2025-09-27 00:53:56.630584 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:53:56.630590 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:53:56.630596 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:53:56.630602 | orchestrator | 2025-09-27 00:53:56.630609 | orchestrator | RUNNING HANDLER [ceph-handler : Osds handler] ********************************** 2025-09-27 00:53:56.630615 | orchestrator | Saturday 27 September 2025 00:46:38 +0000 (0:00:00.509) 0:03:06.041 **** 2025-09-27 00:53:56.630621 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:53:56.630627 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:53:56.630633 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:53:56.630639 | orchestrator | included: /ansible/roles/ceph-handler/tasks/handler_osds.yml for testbed-node-3, testbed-node-4, testbed-node-5 2025-09-27 00:53:56.630644 | orchestrator | 2025-09-27 00:53:56.630650 | orchestrator | RUNNING HANDLER [ceph-handler : Set_fact trigger_restart] ********************** 2025-09-27 00:53:56.630655 | orchestrator | Saturday 27 September 2025 00:46:39 +0000 (0:00:00.840) 0:03:06.881 **** 2025-09-27 00:53:56.630661 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-3)  2025-09-27 00:53:56.630666 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-4)  2025-09-27 00:53:56.630671 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-5)  2025-09-27 00:53:56.630677 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:53:56.630682 | orchestrator | 2025-09-27 00:53:56.630687 | orchestrator | RUNNING HANDLER [ceph-handler : Set _osd_handler_called before restart] ******** 
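
  The mons handler above only stages its restart machinery: the restart script is copied to each monitor host, while the "Restart ceph mon daemon(s)" step itself is skipped because its trigger condition is not met on this run. On a containerized deployment like this one, a restart done by hand would look roughly like the sketch below (illustrative commands only, not the ceph-ansible restart script; the ceph-mon@<hostname> unit and ceph-mon-<hostname> container names are the ceph-ansible conventions visible later in this log, adjust if yours differ):

      # Restart this host's containerized monitor and wait for it to rejoin quorum.
      sudo systemctl restart "ceph-mon@$(hostname -s)"
      until sudo docker exec "ceph-mon-$(hostname -s)" ceph quorum_status -f json \
            | grep -q "\"$(hostname -s)\""; do
          sleep 5
      done
      # Final cluster health check once the monitor is back in quorum.
      sudo docker exec "ceph-mon-$(hostname -s)" ceph -s
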
2025-09-27 00:53:56.630709 | orchestrator | Saturday 27 September 2025 00:46:39 +0000 (0:00:00.609) 0:03:07.491 **** 2025-09-27 00:53:56.630755 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:53:56.630767 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:53:56.630772 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:53:56.630778 | orchestrator | 2025-09-27 00:53:56.630783 | orchestrator | RUNNING HANDLER [ceph-handler : Unset noup flag] ******************************* 2025-09-27 00:53:56.630792 | orchestrator | Saturday 27 September 2025 00:46:40 +0000 (0:00:00.639) 0:03:08.130 **** 2025-09-27 00:53:56.630797 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:53:56.630803 | orchestrator | 2025-09-27 00:53:56.630808 | orchestrator | RUNNING HANDLER [ceph-handler : Copy osd restart script] *********************** 2025-09-27 00:53:56.630813 | orchestrator | Saturday 27 September 2025 00:46:40 +0000 (0:00:00.225) 0:03:08.356 **** 2025-09-27 00:53:56.630819 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:53:56.630824 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:53:56.630829 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:53:56.630834 | orchestrator | 2025-09-27 00:53:56.630840 | orchestrator | RUNNING HANDLER [ceph-handler : Get pool list] ********************************* 2025-09-27 00:53:56.630845 | orchestrator | Saturday 27 September 2025 00:46:41 +0000 (0:00:00.329) 0:03:08.686 **** 2025-09-27 00:53:56.630850 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:53:56.630856 | orchestrator | 2025-09-27 00:53:56.630861 | orchestrator | RUNNING HANDLER [ceph-handler : Get balancer module status] ******************** 2025-09-27 00:53:56.630867 | orchestrator | Saturday 27 September 2025 00:46:41 +0000 (0:00:00.224) 0:03:08.910 **** 2025-09-27 00:53:56.630876 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:53:56.630882 | orchestrator | 2025-09-27 00:53:56.630887 | orchestrator | RUNNING HANDLER [ceph-handler : Set_fact pools_pgautoscaler_mode] ************** 2025-09-27 00:53:56.630892 | orchestrator | Saturday 27 September 2025 00:46:41 +0000 (0:00:00.201) 0:03:09.112 **** 2025-09-27 00:53:56.630898 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:53:56.630903 | orchestrator | 2025-09-27 00:53:56.630908 | orchestrator | RUNNING HANDLER [ceph-handler : Disable balancer] ****************************** 2025-09-27 00:53:56.630914 | orchestrator | Saturday 27 September 2025 00:46:41 +0000 (0:00:00.117) 0:03:09.230 **** 2025-09-27 00:53:56.630919 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:53:56.630924 | orchestrator | 2025-09-27 00:53:56.630930 | orchestrator | RUNNING HANDLER [ceph-handler : Disable pg autoscale on pools] ***************** 2025-09-27 00:53:56.630935 | orchestrator | Saturday 27 September 2025 00:46:41 +0000 (0:00:00.221) 0:03:09.451 **** 2025-09-27 00:53:56.630940 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:53:56.630945 | orchestrator | 2025-09-27 00:53:56.630951 | orchestrator | RUNNING HANDLER [ceph-handler : Restart ceph osds daemon(s)] ******************* 2025-09-27 00:53:56.630956 | orchestrator | Saturday 27 September 2025 00:46:42 +0000 (0:00:00.213) 0:03:09.665 **** 2025-09-27 00:53:56.630961 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-3)  2025-09-27 00:53:56.630967 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-5)  2025-09-27 00:53:56.630972 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-4)  
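
  All of the OSD-handler safety steps above are skipped on this run because no OSD restart was triggered yet, but their task names describe the usual quiescing sequence: pause the balancer and PG autoscaler, typically set noup, restart the OSDs, then restore the previous state. Done manually it would be roughly equivalent to the following (an illustrative sketch assuming an admin keyring is available; restore each pool's autoscale mode to whatever it was before, shown here simply as "on"):

      # Quiesce data movement before restarting OSDs ...
      ceph balancer off
      for pool in $(ceph osd pool ls); do
          ceph osd pool set "$pool" pg_autoscale_mode off
      done
      ceph osd set noup
      # ... restart the ceph-osd units here, then undo the changes:
      ceph osd unset noup
      for pool in $(ceph osd pool ls); do
          ceph osd pool set "$pool" pg_autoscale_mode on
      done
      ceph balancer on
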
2025-09-27 00:53:56.630977 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:53:56.630983 | orchestrator | 2025-09-27 00:53:56.630988 | orchestrator | RUNNING HANDLER [ceph-handler : Set _osd_handler_called after restart] ********* 2025-09-27 00:53:56.630993 | orchestrator | Saturday 27 September 2025 00:46:42 +0000 (0:00:00.719) 0:03:10.384 **** 2025-09-27 00:53:56.630999 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:53:56.631004 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:53:56.631009 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:53:56.631015 | orchestrator | 2025-09-27 00:53:56.631024 | orchestrator | RUNNING HANDLER [ceph-handler : Re-enable pg autoscale on pools] *************** 2025-09-27 00:53:56.631030 | orchestrator | Saturday 27 September 2025 00:46:43 +0000 (0:00:00.687) 0:03:11.071 **** 2025-09-27 00:53:56.631035 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:53:56.631041 | orchestrator | 2025-09-27 00:53:56.631046 | orchestrator | RUNNING HANDLER [ceph-handler : Re-enable balancer] **************************** 2025-09-27 00:53:56.631051 | orchestrator | Saturday 27 September 2025 00:46:43 +0000 (0:00:00.239) 0:03:11.311 **** 2025-09-27 00:53:56.631057 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:53:56.631062 | orchestrator | 2025-09-27 00:53:56.631067 | orchestrator | RUNNING HANDLER [ceph-handler : Mdss handler] ********************************** 2025-09-27 00:53:56.631073 | orchestrator | Saturday 27 September 2025 00:46:43 +0000 (0:00:00.224) 0:03:11.536 **** 2025-09-27 00:53:56.631078 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:53:56.631083 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:53:56.631100 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:53:56.631106 | orchestrator | included: /ansible/roles/ceph-handler/tasks/handler_mdss.yml for testbed-node-3, testbed-node-5, testbed-node-4 2025-09-27 00:53:56.631111 | orchestrator | 2025-09-27 00:53:56.631117 | orchestrator | RUNNING HANDLER [ceph-handler : Set _mds_handler_called before restart] ******** 2025-09-27 00:53:56.631122 | orchestrator | Saturday 27 September 2025 00:46:45 +0000 (0:00:01.121) 0:03:12.657 **** 2025-09-27 00:53:56.631127 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:53:56.631133 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:53:56.631138 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:53:56.631144 | orchestrator | 2025-09-27 00:53:56.631149 | orchestrator | RUNNING HANDLER [ceph-handler : Copy mds restart script] *********************** 2025-09-27 00:53:56.631155 | orchestrator | Saturday 27 September 2025 00:46:45 +0000 (0:00:00.272) 0:03:12.930 **** 2025-09-27 00:53:56.631160 | orchestrator | changed: [testbed-node-3] 2025-09-27 00:53:56.631165 | orchestrator | changed: [testbed-node-5] 2025-09-27 00:53:56.631174 | orchestrator | changed: [testbed-node-4] 2025-09-27 00:53:56.631180 | orchestrator | 2025-09-27 00:53:56.631185 | orchestrator | RUNNING HANDLER [ceph-handler : Restart ceph mds daemon(s)] ******************** 2025-09-27 00:53:56.631191 | orchestrator | Saturday 27 September 2025 00:46:46 +0000 (0:00:01.176) 0:03:14.106 **** 2025-09-27 00:53:56.631196 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-3)  2025-09-27 00:53:56.631201 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-4)  2025-09-27 00:53:56.631207 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-5)  2025-09-27 00:53:56.631212 | orchestrator | skipping: 
[testbed-node-3] 2025-09-27 00:53:56.631217 | orchestrator | 2025-09-27 00:53:56.631223 | orchestrator | RUNNING HANDLER [ceph-handler : Set _mds_handler_called after restart] ********* 2025-09-27 00:53:56.631228 | orchestrator | Saturday 27 September 2025 00:46:47 +0000 (0:00:00.715) 0:03:14.822 **** 2025-09-27 00:53:56.631234 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:53:56.631239 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:53:56.631244 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:53:56.631250 | orchestrator | 2025-09-27 00:53:56.631258 | orchestrator | RUNNING HANDLER [ceph-handler : Rgws handler] ********************************** 2025-09-27 00:53:56.631263 | orchestrator | Saturday 27 September 2025 00:46:47 +0000 (0:00:00.294) 0:03:15.116 **** 2025-09-27 00:53:56.631269 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:53:56.631274 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:53:56.631279 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:53:56.631285 | orchestrator | included: /ansible/roles/ceph-handler/tasks/handler_rgws.yml for testbed-node-3, testbed-node-4, testbed-node-5 2025-09-27 00:53:56.631290 | orchestrator | 2025-09-27 00:53:56.631296 | orchestrator | RUNNING HANDLER [ceph-handler : Set _rgw_handler_called before restart] ******** 2025-09-27 00:53:56.631301 | orchestrator | Saturday 27 September 2025 00:46:48 +0000 (0:00:00.864) 0:03:15.981 **** 2025-09-27 00:53:56.631306 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:53:56.631312 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:53:56.631317 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:53:56.631322 | orchestrator | 2025-09-27 00:53:56.631328 | orchestrator | RUNNING HANDLER [ceph-handler : Copy rgw restart script] *********************** 2025-09-27 00:53:56.631333 | orchestrator | Saturday 27 September 2025 00:46:48 +0000 (0:00:00.255) 0:03:16.236 **** 2025-09-27 00:53:56.631339 | orchestrator | changed: [testbed-node-3] 2025-09-27 00:53:56.631344 | orchestrator | changed: [testbed-node-4] 2025-09-27 00:53:56.631349 | orchestrator | changed: [testbed-node-5] 2025-09-27 00:53:56.631355 | orchestrator | 2025-09-27 00:53:56.631360 | orchestrator | RUNNING HANDLER [ceph-handler : Restart ceph rgw daemon(s)] ******************** 2025-09-27 00:53:56.631365 | orchestrator | Saturday 27 September 2025 00:46:50 +0000 (0:00:01.493) 0:03:17.730 **** 2025-09-27 00:53:56.631371 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-3)  2025-09-27 00:53:56.631376 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-4)  2025-09-27 00:53:56.631382 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-5)  2025-09-27 00:53:56.631387 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:53:56.631392 | orchestrator | 2025-09-27 00:53:56.631398 | orchestrator | RUNNING HANDLER [ceph-handler : Set _rgw_handler_called after restart] ********* 2025-09-27 00:53:56.631403 | orchestrator | Saturday 27 September 2025 00:46:50 +0000 (0:00:00.636) 0:03:18.366 **** 2025-09-27 00:53:56.631408 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:53:56.631414 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:53:56.631419 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:53:56.631424 | orchestrator | 2025-09-27 00:53:56.631430 | orchestrator | RUNNING HANDLER [ceph-handler : Rbdmirrors handler] **************************** 2025-09-27 00:53:56.631435 | orchestrator | Saturday 27 September 2025 00:46:51 +0000 (0:00:00.295) 0:03:18.662 **** 
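
  The rgws handler likewise only stages its restart script here. The instances rendered earlier in this play listen with the beast frontend on port 8081 (see the rgw_frontends values above), so once the gateways are up, a quick liveness check against those endpoints can be done with curl; illustrative only, using the addresses from this run (an unauthenticated GET / typically returns an anonymous ListAllMyBuckets XML response):

      # Check that each radosgw beast endpoint answers.
      for ip in 192.168.16.13 192.168.16.14 192.168.16.15; do
          curl -s -o /dev/null -w "%{http_code} $ip\n" "http://$ip:8081/"
      done
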
2025-09-27 00:53:56.631441 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:53:56.631446 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:53:56.631451 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:53:56.631463 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:53:56.631469 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:53:56.631474 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:53:56.631480 | orchestrator | 2025-09-27 00:53:56.631485 | orchestrator | RUNNING HANDLER [ceph-handler : Mgrs handler] ********************************** 2025-09-27 00:53:56.631490 | orchestrator | Saturday 27 September 2025 00:46:51 +0000 (0:00:00.480) 0:03:19.142 **** 2025-09-27 00:53:56.631499 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:53:56.631504 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:53:56.631510 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:53:56.631515 | orchestrator | included: /ansible/roles/ceph-handler/tasks/handler_mgrs.yml for testbed-node-0, testbed-node-1, testbed-node-2 2025-09-27 00:53:56.631520 | orchestrator | 2025-09-27 00:53:56.631526 | orchestrator | RUNNING HANDLER [ceph-handler : Set _mgr_handler_called before restart] ******** 2025-09-27 00:53:56.631531 | orchestrator | Saturday 27 September 2025 00:46:52 +0000 (0:00:00.848) 0:03:19.991 **** 2025-09-27 00:53:56.631536 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:53:56.631542 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:53:56.631547 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:53:56.631553 | orchestrator | 2025-09-27 00:53:56.631558 | orchestrator | RUNNING HANDLER [ceph-handler : Copy mgr restart script] *********************** 2025-09-27 00:53:56.631563 | orchestrator | Saturday 27 September 2025 00:46:52 +0000 (0:00:00.280) 0:03:20.271 **** 2025-09-27 00:53:56.631569 | orchestrator | changed: [testbed-node-0] 2025-09-27 00:53:56.631574 | orchestrator | changed: [testbed-node-2] 2025-09-27 00:53:56.631579 | orchestrator | changed: [testbed-node-1] 2025-09-27 00:53:56.631585 | orchestrator | 2025-09-27 00:53:56.631590 | orchestrator | RUNNING HANDLER [ceph-handler : Restart ceph mgr daemon(s)] ******************** 2025-09-27 00:53:56.631596 | orchestrator | Saturday 27 September 2025 00:46:54 +0000 (0:00:01.807) 0:03:22.079 **** 2025-09-27 00:53:56.631601 | orchestrator | skipping: [testbed-node-0] => (item=testbed-node-0)  2025-09-27 00:53:56.631606 | orchestrator | skipping: [testbed-node-0] => (item=testbed-node-1)  2025-09-27 00:53:56.631612 | orchestrator | skipping: [testbed-node-0] => (item=testbed-node-2)  2025-09-27 00:53:56.631617 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:53:56.631622 | orchestrator | 2025-09-27 00:53:56.631628 | orchestrator | RUNNING HANDLER [ceph-handler : Set _mgr_handler_called after restart] ********* 2025-09-27 00:53:56.631633 | orchestrator | Saturday 27 September 2025 00:46:55 +0000 (0:00:00.649) 0:03:22.728 **** 2025-09-27 00:53:56.631638 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:53:56.631644 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:53:56.631649 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:53:56.631655 | orchestrator | 2025-09-27 00:53:56.631660 | orchestrator | PLAY [Apply role ceph-mon] ***************************************************** 2025-09-27 00:53:56.631665 | orchestrator | 2025-09-27 00:53:56.631671 | orchestrator | TASK [ceph-handler : Include check_running_cluster.yml] ************************ 2025-09-27 00:53:56.631676 | 
orchestrator | Saturday 27 September 2025 00:46:55 +0000 (0:00:00.545) 0:03:23.274 **** 2025-09-27 00:53:56.631682 | orchestrator | included: /ansible/roles/ceph-handler/tasks/check_running_cluster.yml for testbed-node-0, testbed-node-1, testbed-node-2 2025-09-27 00:53:56.631687 | orchestrator | 2025-09-27 00:53:56.631693 | orchestrator | TASK [ceph-handler : Include check_running_containers.yml] ********************* 2025-09-27 00:53:56.631698 | orchestrator | Saturday 27 September 2025 00:46:56 +0000 (0:00:00.627) 0:03:23.901 **** 2025-09-27 00:53:56.631707 | orchestrator | included: /ansible/roles/ceph-handler/tasks/check_running_containers.yml for testbed-node-0, testbed-node-1, testbed-node-2 2025-09-27 00:53:56.631712 | orchestrator | 2025-09-27 00:53:56.631718 | orchestrator | TASK [ceph-handler : Check for a mon container] ******************************** 2025-09-27 00:53:56.631723 | orchestrator | Saturday 27 September 2025 00:46:56 +0000 (0:00:00.591) 0:03:24.493 **** 2025-09-27 00:53:56.631728 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:53:56.631734 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:53:56.631739 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:53:56.631749 | orchestrator | 2025-09-27 00:53:56.631754 | orchestrator | TASK [ceph-handler : Check for an osd container] ******************************* 2025-09-27 00:53:56.631760 | orchestrator | Saturday 27 September 2025 00:46:57 +0000 (0:00:00.755) 0:03:25.249 **** 2025-09-27 00:53:56.631765 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:53:56.631770 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:53:56.631776 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:53:56.631781 | orchestrator | 2025-09-27 00:53:56.631787 | orchestrator | TASK [ceph-handler : Check for a mds container] ******************************** 2025-09-27 00:53:56.631792 | orchestrator | Saturday 27 September 2025 00:46:58 +0000 (0:00:00.664) 0:03:25.913 **** 2025-09-27 00:53:56.631797 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:53:56.631803 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:53:56.631808 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:53:56.631813 | orchestrator | 2025-09-27 00:53:56.631819 | orchestrator | TASK [ceph-handler : Check for a rgw container] ******************************** 2025-09-27 00:53:56.631824 | orchestrator | Saturday 27 September 2025 00:46:58 +0000 (0:00:00.543) 0:03:26.456 **** 2025-09-27 00:53:56.631830 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:53:56.631835 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:53:56.631840 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:53:56.631846 | orchestrator | 2025-09-27 00:53:56.631851 | orchestrator | TASK [ceph-handler : Check for a mgr container] ******************************** 2025-09-27 00:53:56.631856 | orchestrator | Saturday 27 September 2025 00:46:59 +0000 (0:00:00.481) 0:03:26.938 **** 2025-09-27 00:53:56.631862 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:53:56.631867 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:53:56.631873 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:53:56.631878 | orchestrator | 2025-09-27 00:53:56.631883 | orchestrator | TASK [ceph-handler : Check for a rbd mirror container] ************************* 2025-09-27 00:53:56.631889 | orchestrator | Saturday 27 September 2025 00:47:00 +0000 (0:00:00.860) 0:03:27.798 **** 2025-09-27 00:53:56.631894 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:53:56.631900 | 
orchestrator | skipping: [testbed-node-1] 2025-09-27 00:53:56.631905 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:53:56.631910 | orchestrator | 2025-09-27 00:53:56.631916 | orchestrator | TASK [ceph-handler : Check for a nfs container] ******************************** 2025-09-27 00:53:56.631921 | orchestrator | Saturday 27 September 2025 00:47:00 +0000 (0:00:00.487) 0:03:28.286 **** 2025-09-27 00:53:56.631926 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:53:56.631932 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:53:56.631937 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:53:56.631942 | orchestrator | 2025-09-27 00:53:56.631948 | orchestrator | TASK [ceph-handler : Check for a ceph-crash container] ************************* 2025-09-27 00:53:56.631956 | orchestrator | Saturday 27 September 2025 00:47:00 +0000 (0:00:00.320) 0:03:28.607 **** 2025-09-27 00:53:56.631961 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:53:56.631967 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:53:56.631972 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:53:56.631978 | orchestrator | 2025-09-27 00:53:56.631983 | orchestrator | TASK [ceph-handler : Check for a ceph-exporter container] ********************** 2025-09-27 00:53:56.631988 | orchestrator | Saturday 27 September 2025 00:47:01 +0000 (0:00:00.952) 0:03:29.559 **** 2025-09-27 00:53:56.631994 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:53:56.631999 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:53:56.632005 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:53:56.632010 | orchestrator | 2025-09-27 00:53:56.632015 | orchestrator | TASK [ceph-handler : Include check_socket_non_container.yml] ******************* 2025-09-27 00:53:56.632021 | orchestrator | Saturday 27 September 2025 00:47:02 +0000 (0:00:00.755) 0:03:30.315 **** 2025-09-27 00:53:56.632026 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:53:56.632032 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:53:56.632037 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:53:56.632042 | orchestrator | 2025-09-27 00:53:56.632047 | orchestrator | TASK [ceph-handler : Set_fact handler_mon_status] ****************************** 2025-09-27 00:53:56.632057 | orchestrator | Saturday 27 September 2025 00:47:03 +0000 (0:00:00.424) 0:03:30.740 **** 2025-09-27 00:53:56.632062 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:53:56.632067 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:53:56.632073 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:53:56.632078 | orchestrator | 2025-09-27 00:53:56.632083 | orchestrator | TASK [ceph-handler : Set_fact handler_osd_status] ****************************** 2025-09-27 00:53:56.632100 | orchestrator | Saturday 27 September 2025 00:47:03 +0000 (0:00:00.258) 0:03:30.999 **** 2025-09-27 00:53:56.632106 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:53:56.632111 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:53:56.632116 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:53:56.632122 | orchestrator | 2025-09-27 00:53:56.632127 | orchestrator | TASK [ceph-handler : Set_fact handler_mds_status] ****************************** 2025-09-27 00:53:56.632132 | orchestrator | Saturday 27 September 2025 00:47:03 +0000 (0:00:00.253) 0:03:31.252 **** 2025-09-27 00:53:56.632138 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:53:56.632143 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:53:56.632149 | orchestrator | skipping: [testbed-node-2] 
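
  The check_running_containers.yml block above simply probes each monitor host for running Ceph containers and records the results in the handler_*_status facts that follow; at this point in the deployment only the mon, mgr, ceph-crash and ceph-exporter containers are found. The same check done by hand looks roughly like this (illustrative; ceph-ansible typically names containers ceph-<daemon>-<hostname>, and podman can be substituted for docker if that is the runtime in use):

      # Show every running Ceph container on this host.
      docker ps --filter "name=ceph-" --format "table {{.Names}}\t{{.Status}}"
      # The handler facts boil down to "is there a container for this daemon?":
      docker ps -q --filter "name=ceph-mon-$(hostname -s)" | grep -q . \
          && echo "mon container running" || echo "no mon container"
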
2025-09-27 00:53:56.632154 | orchestrator | 2025-09-27 00:53:56.632159 | orchestrator | TASK [ceph-handler : Set_fact handler_rgw_status] ****************************** 2025-09-27 00:53:56.632165 | orchestrator | Saturday 27 September 2025 00:47:03 +0000 (0:00:00.266) 0:03:31.519 **** 2025-09-27 00:53:56.632170 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:53:56.632176 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:53:56.632181 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:53:56.632186 | orchestrator | 2025-09-27 00:53:56.632192 | orchestrator | TASK [ceph-handler : Set_fact handler_nfs_status] ****************************** 2025-09-27 00:53:56.632197 | orchestrator | Saturday 27 September 2025 00:47:04 +0000 (0:00:00.266) 0:03:31.785 **** 2025-09-27 00:53:56.632202 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:53:56.632208 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:53:56.632216 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:53:56.632221 | orchestrator | 2025-09-27 00:53:56.632227 | orchestrator | TASK [ceph-handler : Set_fact handler_rbd_status] ****************************** 2025-09-27 00:53:56.632232 | orchestrator | Saturday 27 September 2025 00:47:04 +0000 (0:00:00.444) 0:03:32.229 **** 2025-09-27 00:53:56.632238 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:53:56.632243 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:53:56.632249 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:53:56.632254 | orchestrator | 2025-09-27 00:53:56.632259 | orchestrator | TASK [ceph-handler : Set_fact handler_mgr_status] ****************************** 2025-09-27 00:53:56.632265 | orchestrator | Saturday 27 September 2025 00:47:04 +0000 (0:00:00.309) 0:03:32.539 **** 2025-09-27 00:53:56.632270 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:53:56.632275 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:53:56.632281 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:53:56.632286 | orchestrator | 2025-09-27 00:53:56.632292 | orchestrator | TASK [ceph-handler : Set_fact handler_crash_status] **************************** 2025-09-27 00:53:56.632297 | orchestrator | Saturday 27 September 2025 00:47:05 +0000 (0:00:00.359) 0:03:32.899 **** 2025-09-27 00:53:56.632302 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:53:56.632308 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:53:56.632313 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:53:56.632318 | orchestrator | 2025-09-27 00:53:56.632324 | orchestrator | TASK [ceph-handler : Set_fact handler_exporter_status] ************************* 2025-09-27 00:53:56.632329 | orchestrator | Saturday 27 September 2025 00:47:05 +0000 (0:00:00.500) 0:03:33.399 **** 2025-09-27 00:53:56.632335 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:53:56.632340 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:53:56.632345 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:53:56.632351 | orchestrator | 2025-09-27 00:53:56.632356 | orchestrator | TASK [ceph-mon : Set_fact container_exec_cmd] ********************************** 2025-09-27 00:53:56.632362 | orchestrator | Saturday 27 September 2025 00:47:06 +0000 (0:00:00.746) 0:03:34.146 **** 2025-09-27 00:53:56.632371 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:53:56.632376 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:53:56.632381 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:53:56.632387 | orchestrator | 2025-09-27 00:53:56.632392 | orchestrator | TASK [ceph-mon : Include deploy_monitors.yml] 
********************************** 2025-09-27 00:53:56.632398 | orchestrator | Saturday 27 September 2025 00:47:06 +0000 (0:00:00.284) 0:03:34.430 **** 2025-09-27 00:53:56.632403 | orchestrator | included: /ansible/roles/ceph-mon/tasks/deploy_monitors.yml for testbed-node-0, testbed-node-1, testbed-node-2 2025-09-27 00:53:56.632409 | orchestrator | 2025-09-27 00:53:56.632414 | orchestrator | TASK [ceph-mon : Check if monitor initial keyring already exists] ************** 2025-09-27 00:53:56.632419 | orchestrator | Saturday 27 September 2025 00:47:07 +0000 (0:00:00.568) 0:03:34.999 **** 2025-09-27 00:53:56.632425 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:53:56.632430 | orchestrator | 2025-09-27 00:53:56.632436 | orchestrator | TASK [ceph-mon : Generate monitor initial keyring] ***************************** 2025-09-27 00:53:56.632441 | orchestrator | Saturday 27 September 2025 00:47:07 +0000 (0:00:00.128) 0:03:35.128 **** 2025-09-27 00:53:56.632446 | orchestrator | changed: [testbed-node-0 -> localhost] 2025-09-27 00:53:56.632452 | orchestrator | 2025-09-27 00:53:56.632460 | orchestrator | TASK [ceph-mon : Set_fact _initial_mon_key_success] **************************** 2025-09-27 00:53:56.632465 | orchestrator | Saturday 27 September 2025 00:47:08 +0000 (0:00:00.809) 0:03:35.938 **** 2025-09-27 00:53:56.632471 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:53:56.632476 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:53:56.632481 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:53:56.632487 | orchestrator | 2025-09-27 00:53:56.632492 | orchestrator | TASK [ceph-mon : Get initial keyring when it already exists] ******************* 2025-09-27 00:53:56.632497 | orchestrator | Saturday 27 September 2025 00:47:08 +0000 (0:00:00.272) 0:03:36.210 **** 2025-09-27 00:53:56.632503 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:53:56.632508 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:53:56.632514 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:53:56.632519 | orchestrator | 2025-09-27 00:53:56.632524 | orchestrator | TASK [ceph-mon : Create monitor initial keyring] ******************************* 2025-09-27 00:53:56.632530 | orchestrator | Saturday 27 September 2025 00:47:09 +0000 (0:00:00.437) 0:03:36.647 **** 2025-09-27 00:53:56.632535 | orchestrator | changed: [testbed-node-0] 2025-09-27 00:53:56.632541 | orchestrator | changed: [testbed-node-1] 2025-09-27 00:53:56.632546 | orchestrator | changed: [testbed-node-2] 2025-09-27 00:53:56.632551 | orchestrator | 2025-09-27 00:53:56.632557 | orchestrator | TASK [ceph-mon : Copy the initial key in /etc/ceph (for containers)] *********** 2025-09-27 00:53:56.632562 | orchestrator | Saturday 27 September 2025 00:47:10 +0000 (0:00:01.239) 0:03:37.887 **** 2025-09-27 00:53:56.632567 | orchestrator | changed: [testbed-node-0] 2025-09-27 00:53:56.632573 | orchestrator | changed: [testbed-node-1] 2025-09-27 00:53:56.632578 | orchestrator | changed: [testbed-node-2] 2025-09-27 00:53:56.632583 | orchestrator | 2025-09-27 00:53:56.632589 | orchestrator | TASK [ceph-mon : Create monitor directory] ************************************* 2025-09-27 00:53:56.632594 | orchestrator | Saturday 27 September 2025 00:47:11 +0000 (0:00:00.982) 0:03:38.869 **** 2025-09-27 00:53:56.632600 | orchestrator | changed: [testbed-node-1] 2025-09-27 00:53:56.632605 | orchestrator | changed: [testbed-node-0] 2025-09-27 00:53:56.632610 | orchestrator | changed: [testbed-node-2] 2025-09-27 00:53:56.632616 | orchestrator | 2025-09-27 00:53:56.632621 
| orchestrator | TASK [ceph-mon : Recursively fix ownership of monitor directory] *************** 2025-09-27 00:53:56.632627 | orchestrator | Saturday 27 September 2025 00:47:12 +0000 (0:00:00.776) 0:03:39.645 **** 2025-09-27 00:53:56.632632 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:53:56.632637 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:53:56.632643 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:53:56.632648 | orchestrator | 2025-09-27 00:53:56.632654 | orchestrator | TASK [ceph-mon : Create admin keyring] ***************************************** 2025-09-27 00:53:56.632662 | orchestrator | Saturday 27 September 2025 00:47:12 +0000 (0:00:00.837) 0:03:40.483 **** 2025-09-27 00:53:56.632668 | orchestrator | changed: [testbed-node-0] 2025-09-27 00:53:56.632673 | orchestrator | 2025-09-27 00:53:56.632679 | orchestrator | TASK [ceph-mon : Slurp admin keyring] ****************************************** 2025-09-27 00:53:56.632684 | orchestrator | Saturday 27 September 2025 00:47:14 +0000 (0:00:01.164) 0:03:41.648 **** 2025-09-27 00:53:56.632689 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:53:56.632695 | orchestrator | 2025-09-27 00:53:56.632703 | orchestrator | TASK [ceph-mon : Copy admin keyring over to mons] ****************************** 2025-09-27 00:53:56.632708 | orchestrator | Saturday 27 September 2025 00:47:14 +0000 (0:00:00.641) 0:03:42.289 **** 2025-09-27 00:53:56.632714 | orchestrator | changed: [testbed-node-0] => (item=None) 2025-09-27 00:53:56.632719 | orchestrator | ok: [testbed-node-1 -> testbed-node-0(192.168.16.10)] => (item=None) 2025-09-27 00:53:56.632725 | orchestrator | ok: [testbed-node-2 -> testbed-node-0(192.168.16.10)] => (item=None) 2025-09-27 00:53:56.632730 | orchestrator | changed: [testbed-node-0 -> testbed-node-1(192.168.16.11)] => (item=None) 2025-09-27 00:53:56.632736 | orchestrator | ok: [testbed-node-1] => (item=None) 2025-09-27 00:53:56.632741 | orchestrator | ok: [testbed-node-2 -> testbed-node-1(192.168.16.11)] => (item=None) 2025-09-27 00:53:56.632746 | orchestrator | changed: [testbed-node-1 -> testbed-node-2(192.168.16.12)] => (item=None) 2025-09-27 00:53:56.632752 | orchestrator | changed: [testbed-node-1 -> {{ item }}] 2025-09-27 00:53:56.632757 | orchestrator | ok: [testbed-node-0 -> testbed-node-2(192.168.16.12)] => (item=None) 2025-09-27 00:53:56.632762 | orchestrator | changed: [testbed-node-0 -> {{ item }}] 2025-09-27 00:53:56.632768 | orchestrator | ok: [testbed-node-2] => (item=None) 2025-09-27 00:53:56.632773 | orchestrator | ok: [testbed-node-2 -> {{ item }}] 2025-09-27 00:53:56.632779 | orchestrator | 2025-09-27 00:53:56.632784 | orchestrator | TASK [ceph-mon : Import admin keyring into mon keyring] ************************ 2025-09-27 00:53:56.632789 | orchestrator | Saturday 27 September 2025 00:47:18 +0000 (0:00:03.618) 0:03:45.908 **** 2025-09-27 00:53:56.632795 | orchestrator | changed: [testbed-node-0] 2025-09-27 00:53:56.632800 | orchestrator | changed: [testbed-node-2] 2025-09-27 00:53:56.632805 | orchestrator | changed: [testbed-node-1] 2025-09-27 00:53:56.632811 | orchestrator | 2025-09-27 00:53:56.632816 | orchestrator | TASK [ceph-mon : Set_fact ceph-mon container command] ************************** 2025-09-27 00:53:56.632821 | orchestrator | Saturday 27 September 2025 00:47:19 +0000 (0:00:01.194) 0:03:47.102 **** 2025-09-27 00:53:56.632827 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:53:56.632832 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:53:56.632838 | orchestrator | ok: 
[testbed-node-2] 2025-09-27 00:53:56.632843 | orchestrator | 2025-09-27 00:53:56.632848 | orchestrator | TASK [ceph-mon : Set_fact monmaptool container command] ************************ 2025-09-27 00:53:56.632854 | orchestrator | Saturday 27 September 2025 00:47:19 +0000 (0:00:00.296) 0:03:47.399 **** 2025-09-27 00:53:56.632859 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:53:56.632864 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:53:56.632870 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:53:56.632875 | orchestrator | 2025-09-27 00:53:56.632881 | orchestrator | TASK [ceph-mon : Generate initial monmap] ************************************** 2025-09-27 00:53:56.632886 | orchestrator | Saturday 27 September 2025 00:47:20 +0000 (0:00:00.303) 0:03:47.703 **** 2025-09-27 00:53:56.632891 | orchestrator | changed: [testbed-node-1] 2025-09-27 00:53:56.632897 | orchestrator | changed: [testbed-node-2] 2025-09-27 00:53:56.632902 | orchestrator | changed: [testbed-node-0] 2025-09-27 00:53:56.632908 | orchestrator | 2025-09-27 00:53:56.632913 | orchestrator | TASK [ceph-mon : Ceph monitor mkfs with keyring] ******************************* 2025-09-27 00:53:56.632921 | orchestrator | Saturday 27 September 2025 00:47:22 +0000 (0:00:02.065) 0:03:49.768 **** 2025-09-27 00:53:56.632926 | orchestrator | changed: [testbed-node-0] 2025-09-27 00:53:56.632932 | orchestrator | changed: [testbed-node-1] 2025-09-27 00:53:56.632937 | orchestrator | changed: [testbed-node-2] 2025-09-27 00:53:56.632942 | orchestrator | 2025-09-27 00:53:56.632951 | orchestrator | TASK [ceph-mon : Ceph monitor mkfs without keyring] **************************** 2025-09-27 00:53:56.632957 | orchestrator | Saturday 27 September 2025 00:47:23 +0000 (0:00:01.238) 0:03:51.007 **** 2025-09-27 00:53:56.632962 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:53:56.632967 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:53:56.632973 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:53:56.632978 | orchestrator | 2025-09-27 00:53:56.632983 | orchestrator | TASK [ceph-mon : Include start_monitor.yml] ************************************ 2025-09-27 00:53:56.632989 | orchestrator | Saturday 27 September 2025 00:47:23 +0000 (0:00:00.313) 0:03:51.320 **** 2025-09-27 00:53:56.632994 | orchestrator | included: /ansible/roles/ceph-mon/tasks/start_monitor.yml for testbed-node-0, testbed-node-1, testbed-node-2 2025-09-27 00:53:56.632999 | orchestrator | 2025-09-27 00:53:56.633005 | orchestrator | TASK [ceph-mon : Ensure systemd service override directory exists] ************* 2025-09-27 00:53:56.633010 | orchestrator | Saturday 27 September 2025 00:47:24 +0000 (0:00:00.504) 0:03:51.824 **** 2025-09-27 00:53:56.633015 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:53:56.633021 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:53:56.633026 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:53:56.633031 | orchestrator | 2025-09-27 00:53:56.633037 | orchestrator | TASK [ceph-mon : Add ceph-mon systemd service overrides] *********************** 2025-09-27 00:53:56.633042 | orchestrator | Saturday 27 September 2025 00:47:24 +0000 (0:00:00.403) 0:03:52.228 **** 2025-09-27 00:53:56.633047 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:53:56.633053 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:53:56.633058 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:53:56.633063 | orchestrator | 2025-09-27 00:53:56.633069 | orchestrator | TASK [ceph-mon : Include_tasks 
systemd.yml] ************************************ 2025-09-27 00:53:56.633074 | orchestrator | Saturday 27 September 2025 00:47:24 +0000 (0:00:00.260) 0:03:52.488 **** 2025-09-27 00:53:56.633080 | orchestrator | included: /ansible/roles/ceph-mon/tasks/systemd.yml for testbed-node-0, testbed-node-1, testbed-node-2 2025-09-27 00:53:56.633096 | orchestrator | 2025-09-27 00:53:56.633102 | orchestrator | TASK [ceph-mon : Generate systemd unit file for mon container] ***************** 2025-09-27 00:53:56.633108 | orchestrator | Saturday 27 September 2025 00:47:25 +0000 (0:00:00.476) 0:03:52.965 **** 2025-09-27 00:53:56.633113 | orchestrator | changed: [testbed-node-0] 2025-09-27 00:53:56.633118 | orchestrator | changed: [testbed-node-1] 2025-09-27 00:53:56.633124 | orchestrator | changed: [testbed-node-2] 2025-09-27 00:53:56.633129 | orchestrator | 2025-09-27 00:53:56.633134 | orchestrator | TASK [ceph-mon : Generate systemd ceph-mon target file] ************************ 2025-09-27 00:53:56.633139 | orchestrator | Saturday 27 September 2025 00:47:26 +0000 (0:00:01.548) 0:03:54.514 **** 2025-09-27 00:53:56.633149 | orchestrator | changed: [testbed-node-0] 2025-09-27 00:53:56.633155 | orchestrator | changed: [testbed-node-1] 2025-09-27 00:53:56.633160 | orchestrator | changed: [testbed-node-2] 2025-09-27 00:53:56.633166 | orchestrator | 2025-09-27 00:53:56.633171 | orchestrator | TASK [ceph-mon : Enable ceph-mon.target] *************************************** 2025-09-27 00:53:56.633176 | orchestrator | Saturday 27 September 2025 00:47:27 +0000 (0:00:01.108) 0:03:55.622 **** 2025-09-27 00:53:56.633182 | orchestrator | changed: [testbed-node-0] 2025-09-27 00:53:56.633187 | orchestrator | changed: [testbed-node-2] 2025-09-27 00:53:56.633192 | orchestrator | changed: [testbed-node-1] 2025-09-27 00:53:56.633198 | orchestrator | 2025-09-27 00:53:56.633203 | orchestrator | TASK [ceph-mon : Start the monitor service] ************************************ 2025-09-27 00:53:56.633208 | orchestrator | Saturday 27 September 2025 00:47:29 +0000 (0:00:01.673) 0:03:57.295 **** 2025-09-27 00:53:56.633214 | orchestrator | changed: [testbed-node-1] 2025-09-27 00:53:56.633219 | orchestrator | changed: [testbed-node-0] 2025-09-27 00:53:56.633224 | orchestrator | changed: [testbed-node-2] 2025-09-27 00:53:56.633230 | orchestrator | 2025-09-27 00:53:56.633235 | orchestrator | TASK [ceph-mon : Include_tasks ceph_keys.yml] ********************************** 2025-09-27 00:53:56.633241 | orchestrator | Saturday 27 September 2025 00:47:31 +0000 (0:00:01.877) 0:03:59.173 **** 2025-09-27 00:53:56.633250 | orchestrator | included: /ansible/roles/ceph-mon/tasks/ceph_keys.yml for testbed-node-0, testbed-node-1, testbed-node-2 2025-09-27 00:53:56.633255 | orchestrator | 2025-09-27 00:53:56.633261 | orchestrator | TASK [ceph-mon : Waiting for the monitor(s) to form the quorum...] 
************* 2025-09-27 00:53:56.633266 | orchestrator | Saturday 27 September 2025 00:47:32 +0000 (0:00:00.645) 0:03:59.818 **** 2025-09-27 00:53:56.633271 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:53:56.633277 | orchestrator | 2025-09-27 00:53:56.633282 | orchestrator | TASK [ceph-mon : Fetch ceph initial keys] ************************************** 2025-09-27 00:53:56.633287 | orchestrator | Saturday 27 September 2025 00:47:33 +0000 (0:00:01.265) 0:04:01.084 **** 2025-09-27 00:53:56.633293 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:53:56.633298 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:53:56.633303 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:53:56.633309 | orchestrator | 2025-09-27 00:53:56.633314 | orchestrator | TASK [ceph-mon : Include secure_cluster.yml] *********************************** 2025-09-27 00:53:56.633319 | orchestrator | Saturday 27 September 2025 00:47:42 +0000 (0:00:09.072) 0:04:10.156 **** 2025-09-27 00:53:56.633325 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:53:56.633330 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:53:56.633335 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:53:56.633341 | orchestrator | 2025-09-27 00:53:56.633346 | orchestrator | TASK [ceph-mon : Set cluster configs] ****************************************** 2025-09-27 00:53:56.633351 | orchestrator | Saturday 27 September 2025 00:47:42 +0000 (0:00:00.268) 0:04:10.425 **** 2025-09-27 00:53:56.633361 | orchestrator | changed: [testbed-node-0] => (item=[{'key': 'global', 'value': {'public_network': '192.168.16.0/20', 'cluster_network': '192.168.16.0/20', 'osd_pool_default_crush_rule': -1, 'ms_bind_ipv6': 'False', 'ms_bind_ipv4': 'True', 'osd_crush_chooseleaf_type': '__omit_place_holder__96a77dfc6fdabae22824d74affee25ac39642e2f'}}, {'key': 'public_network', 'value': '192.168.16.0/20'}]) 2025-09-27 00:53:56.633368 | orchestrator | changed: [testbed-node-0] => (item=[{'key': 'global', 'value': {'public_network': '192.168.16.0/20', 'cluster_network': '192.168.16.0/20', 'osd_pool_default_crush_rule': -1, 'ms_bind_ipv6': 'False', 'ms_bind_ipv4': 'True', 'osd_crush_chooseleaf_type': '__omit_place_holder__96a77dfc6fdabae22824d74affee25ac39642e2f'}}, {'key': 'cluster_network', 'value': '192.168.16.0/20'}]) 2025-09-27 00:53:56.633374 | orchestrator | changed: [testbed-node-0] => (item=[{'key': 'global', 'value': {'public_network': '192.168.16.0/20', 'cluster_network': '192.168.16.0/20', 'osd_pool_default_crush_rule': -1, 'ms_bind_ipv6': 'False', 'ms_bind_ipv4': 'True', 'osd_crush_chooseleaf_type': '__omit_place_holder__96a77dfc6fdabae22824d74affee25ac39642e2f'}}, {'key': 'osd_pool_default_crush_rule', 'value': -1}]) 2025-09-27 00:53:56.633380 | orchestrator | changed: [testbed-node-0] => (item=[{'key': 'global', 'value': {'public_network': '192.168.16.0/20', 'cluster_network': '192.168.16.0/20', 'osd_pool_default_crush_rule': -1, 'ms_bind_ipv6': 'False', 'ms_bind_ipv4': 'True', 'osd_crush_chooseleaf_type': '__omit_place_holder__96a77dfc6fdabae22824d74affee25ac39642e2f'}}, {'key': 'ms_bind_ipv6', 'value': 'False'}]) 2025-09-27 00:53:56.633387 | orchestrator | changed: [testbed-node-0] => (item=[{'key': 'global', 'value': {'public_network': '192.168.16.0/20', 'cluster_network': '192.168.16.0/20', 'osd_pool_default_crush_rule': -1, 'ms_bind_ipv6': 'False', 'ms_bind_ipv4': 'True', 'osd_crush_chooseleaf_type': '__omit_place_holder__96a77dfc6fdabae22824d74affee25ac39642e2f'}}, {'key': 'ms_bind_ipv4', 'value': 'True'}]) 2025-09-27 
00:53:56.633396 | orchestrator | skipping: [testbed-node-0] => (item=[{'key': 'global', 'value': {'public_network': '192.168.16.0/20', 'cluster_network': '192.168.16.0/20', 'osd_pool_default_crush_rule': -1, 'ms_bind_ipv6': 'False', 'ms_bind_ipv4': 'True', 'osd_crush_chooseleaf_type': '__omit_place_holder__96a77dfc6fdabae22824d74affee25ac39642e2f'}}, {'key': 'osd_crush_chooseleaf_type', 'value': '__omit_place_holder__96a77dfc6fdabae22824d74affee25ac39642e2f'}])  2025-09-27 00:53:56.633406 | orchestrator | 2025-09-27 00:53:56.633412 | orchestrator | RUNNING HANDLER [ceph-handler : Make tempdir for scripts] ********************** 2025-09-27 00:53:56.633417 | orchestrator | Saturday 27 September 2025 00:47:58 +0000 (0:00:15.253) 0:04:25.678 **** 2025-09-27 00:53:56.633423 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:53:56.633428 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:53:56.633433 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:53:56.633439 | orchestrator | 2025-09-27 00:53:56.633444 | orchestrator | RUNNING HANDLER [ceph-handler : Mons handler] ********************************** 2025-09-27 00:53:56.633449 | orchestrator | Saturday 27 September 2025 00:47:58 +0000 (0:00:00.307) 0:04:25.986 **** 2025-09-27 00:53:56.633455 | orchestrator | included: /ansible/roles/ceph-handler/tasks/handler_mons.yml for testbed-node-0, testbed-node-1, testbed-node-2 2025-09-27 00:53:56.633460 | orchestrator | 2025-09-27 00:53:56.633465 | orchestrator | RUNNING HANDLER [ceph-handler : Set _mon_handler_called before restart] ******** 2025-09-27 00:53:56.633471 | orchestrator | Saturday 27 September 2025 00:47:58 +0000 (0:00:00.566) 0:04:26.553 **** 2025-09-27 00:53:56.633476 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:53:56.633482 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:53:56.633487 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:53:56.633492 | orchestrator | 2025-09-27 00:53:56.633498 | orchestrator | RUNNING HANDLER [ceph-handler : Copy mon restart script] *********************** 2025-09-27 00:53:56.633503 | orchestrator | Saturday 27 September 2025 00:47:59 +0000 (0:00:00.270) 0:04:26.824 **** 2025-09-27 00:53:56.633508 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:53:56.633514 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:53:56.633519 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:53:56.633524 | orchestrator | 2025-09-27 00:53:56.633529 | orchestrator | RUNNING HANDLER [ceph-handler : Restart ceph mon daemon(s)] ******************** 2025-09-27 00:53:56.633535 | orchestrator | Saturday 27 September 2025 00:47:59 +0000 (0:00:00.285) 0:04:27.109 **** 2025-09-27 00:53:56.633540 | orchestrator | skipping: [testbed-node-0] => (item=testbed-node-0)  2025-09-27 00:53:56.633546 | orchestrator | skipping: [testbed-node-0] => (item=testbed-node-1)  2025-09-27 00:53:56.633551 | orchestrator | skipping: [testbed-node-0] => (item=testbed-node-2)  2025-09-27 00:53:56.633556 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:53:56.633562 | orchestrator | 2025-09-27 00:53:56.633567 | orchestrator | RUNNING HANDLER [ceph-handler : Set _mon_handler_called after restart] ********* 2025-09-27 00:53:56.633573 | orchestrator | Saturday 27 September 2025 00:48:00 +0000 (0:00:00.680) 0:04:27.789 **** 2025-09-27 00:53:56.633578 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:53:56.633583 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:53:56.633589 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:53:56.633594 | 
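The "Set cluster configs" loop above writes the network options shown in its items into the monitors' central config store, and the earlier quorum wait simply polls the monitors until they agree. A minimal equivalent by hand, with the values taken from the loop items (the crush-rule default in the loop is set the same way):

    # Confirm the monitors have formed a quorum, then apply the settings.
    ceph quorum_status --format json-pretty
    ceph config set global public_network 192.168.16.0/20
    ceph config set global cluster_network 192.168.16.0/20
    ceph config set global ms_bind_ipv6 false
    ceph config set global ms_bind_ipv4 true
    ceph config dump   # verify what landed in the config store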
orchestrator | 2025-09-27 00:53:56.633599 | orchestrator | PLAY [Apply role ceph-mgr] ***************************************************** 2025-09-27 00:53:56.633605 | orchestrator | 2025-09-27 00:53:56.633610 | orchestrator | TASK [ceph-handler : Include check_running_cluster.yml] ************************ 2025-09-27 00:53:56.633616 | orchestrator | Saturday 27 September 2025 00:48:00 +0000 (0:00:00.633) 0:04:28.423 **** 2025-09-27 00:53:56.633624 | orchestrator | included: /ansible/roles/ceph-handler/tasks/check_running_cluster.yml for testbed-node-0, testbed-node-1, testbed-node-2 2025-09-27 00:53:56.633629 | orchestrator | 2025-09-27 00:53:56.633635 | orchestrator | TASK [ceph-handler : Include check_running_containers.yml] ********************* 2025-09-27 00:53:56.633640 | orchestrator | Saturday 27 September 2025 00:48:01 +0000 (0:00:00.454) 0:04:28.877 **** 2025-09-27 00:53:56.633646 | orchestrator | included: /ansible/roles/ceph-handler/tasks/check_running_containers.yml for testbed-node-0, testbed-node-1, testbed-node-2 2025-09-27 00:53:56.633651 | orchestrator | 2025-09-27 00:53:56.633657 | orchestrator | TASK [ceph-handler : Check for a mon container] ******************************** 2025-09-27 00:53:56.633662 | orchestrator | Saturday 27 September 2025 00:48:01 +0000 (0:00:00.553) 0:04:29.431 **** 2025-09-27 00:53:56.633667 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:53:56.633676 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:53:56.633682 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:53:56.633687 | orchestrator | 2025-09-27 00:53:56.633693 | orchestrator | TASK [ceph-handler : Check for an osd container] ******************************* 2025-09-27 00:53:56.633698 | orchestrator | Saturday 27 September 2025 00:48:02 +0000 (0:00:00.710) 0:04:30.141 **** 2025-09-27 00:53:56.633704 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:53:56.633709 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:53:56.633714 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:53:56.633720 | orchestrator | 2025-09-27 00:53:56.633725 | orchestrator | TASK [ceph-handler : Check for a mds container] ******************************** 2025-09-27 00:53:56.633731 | orchestrator | Saturday 27 September 2025 00:48:02 +0000 (0:00:00.275) 0:04:30.417 **** 2025-09-27 00:53:56.633736 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:53:56.633742 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:53:56.633747 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:53:56.633752 | orchestrator | 2025-09-27 00:53:56.633757 | orchestrator | TASK [ceph-handler : Check for a rgw container] ******************************** 2025-09-27 00:53:56.633763 | orchestrator | Saturday 27 September 2025 00:48:03 +0000 (0:00:00.261) 0:04:30.679 **** 2025-09-27 00:53:56.633768 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:53:56.633774 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:53:56.633779 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:53:56.633784 | orchestrator | 2025-09-27 00:53:56.633789 | orchestrator | TASK [ceph-handler : Check for a mgr container] ******************************** 2025-09-27 00:53:56.633795 | orchestrator | Saturday 27 September 2025 00:48:03 +0000 (0:00:00.428) 0:04:31.108 **** 2025-09-27 00:53:56.633800 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:53:56.633806 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:53:56.633811 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:53:56.633816 | orchestrator | 
2025-09-27 00:53:56.633822 | orchestrator | TASK [ceph-handler : Check for a rbd mirror container] ************************* 2025-09-27 00:53:56.633827 | orchestrator | Saturday 27 September 2025 00:48:04 +0000 (0:00:00.645) 0:04:31.754 **** 2025-09-27 00:53:56.633833 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:53:56.633838 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:53:56.633846 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:53:56.633852 | orchestrator | 2025-09-27 00:53:56.633857 | orchestrator | TASK [ceph-handler : Check for a nfs container] ******************************** 2025-09-27 00:53:56.633863 | orchestrator | Saturday 27 September 2025 00:48:04 +0000 (0:00:00.263) 0:04:32.017 **** 2025-09-27 00:53:56.633868 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:53:56.633873 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:53:56.633879 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:53:56.633884 | orchestrator | 2025-09-27 00:53:56.633889 | orchestrator | TASK [ceph-handler : Check for a ceph-crash container] ************************* 2025-09-27 00:53:56.633895 | orchestrator | Saturday 27 September 2025 00:48:04 +0000 (0:00:00.266) 0:04:32.283 **** 2025-09-27 00:53:56.633900 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:53:56.633905 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:53:56.633911 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:53:56.633916 | orchestrator | 2025-09-27 00:53:56.633922 | orchestrator | TASK [ceph-handler : Check for a ceph-exporter container] ********************** 2025-09-27 00:53:56.633927 | orchestrator | Saturday 27 September 2025 00:48:05 +0000 (0:00:00.851) 0:04:33.135 **** 2025-09-27 00:53:56.633932 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:53:56.633938 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:53:56.633943 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:53:56.633948 | orchestrator | 2025-09-27 00:53:56.633954 | orchestrator | TASK [ceph-handler : Include check_socket_non_container.yml] ******************* 2025-09-27 00:53:56.633959 | orchestrator | Saturday 27 September 2025 00:48:06 +0000 (0:00:00.655) 0:04:33.790 **** 2025-09-27 00:53:56.633965 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:53:56.633970 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:53:56.633979 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:53:56.633984 | orchestrator | 2025-09-27 00:53:56.633989 | orchestrator | TASK [ceph-handler : Set_fact handler_mon_status] ****************************** 2025-09-27 00:53:56.633995 | orchestrator | Saturday 27 September 2025 00:48:06 +0000 (0:00:00.316) 0:04:34.107 **** 2025-09-27 00:53:56.634000 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:53:56.634006 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:53:56.634011 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:53:56.634034 | orchestrator | 2025-09-27 00:53:56.634039 | orchestrator | TASK [ceph-handler : Set_fact handler_osd_status] ****************************** 2025-09-27 00:53:56.634045 | orchestrator | Saturday 27 September 2025 00:48:06 +0000 (0:00:00.315) 0:04:34.422 **** 2025-09-27 00:53:56.634050 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:53:56.634055 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:53:56.634061 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:53:56.634066 | orchestrator | 2025-09-27 00:53:56.634072 | orchestrator | TASK [ceph-handler : Set_fact handler_mds_status] 
****************************** 2025-09-27 00:53:56.634077 | orchestrator | Saturday 27 September 2025 00:48:07 +0000 (0:00:00.321) 0:04:34.744 **** 2025-09-27 00:53:56.634082 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:53:56.634115 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:53:56.634121 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:53:56.634126 | orchestrator | 2025-09-27 00:53:56.634132 | orchestrator | TASK [ceph-handler : Set_fact handler_rgw_status] ****************************** 2025-09-27 00:53:56.634137 | orchestrator | Saturday 27 September 2025 00:48:07 +0000 (0:00:00.532) 0:04:35.276 **** 2025-09-27 00:53:56.634147 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:53:56.634153 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:53:56.634158 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:53:56.634163 | orchestrator | 2025-09-27 00:53:56.634169 | orchestrator | TASK [ceph-handler : Set_fact handler_nfs_status] ****************************** 2025-09-27 00:53:56.634174 | orchestrator | Saturday 27 September 2025 00:48:07 +0000 (0:00:00.287) 0:04:35.564 **** 2025-09-27 00:53:56.634180 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:53:56.634185 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:53:56.634190 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:53:56.634196 | orchestrator | 2025-09-27 00:53:56.634201 | orchestrator | TASK [ceph-handler : Set_fact handler_rbd_status] ****************************** 2025-09-27 00:53:56.634207 | orchestrator | Saturday 27 September 2025 00:48:08 +0000 (0:00:00.301) 0:04:35.866 **** 2025-09-27 00:53:56.634212 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:53:56.634217 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:53:56.634223 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:53:56.634228 | orchestrator | 2025-09-27 00:53:56.634233 | orchestrator | TASK [ceph-handler : Set_fact handler_mgr_status] ****************************** 2025-09-27 00:53:56.634239 | orchestrator | Saturday 27 September 2025 00:48:08 +0000 (0:00:00.286) 0:04:36.152 **** 2025-09-27 00:53:56.634244 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:53:56.634250 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:53:56.634255 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:53:56.634260 | orchestrator | 2025-09-27 00:53:56.634266 | orchestrator | TASK [ceph-handler : Set_fact handler_crash_status] **************************** 2025-09-27 00:53:56.634271 | orchestrator | Saturday 27 September 2025 00:48:09 +0000 (0:00:00.537) 0:04:36.690 **** 2025-09-27 00:53:56.634277 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:53:56.634282 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:53:56.634287 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:53:56.634293 | orchestrator | 2025-09-27 00:53:56.634298 | orchestrator | TASK [ceph-handler : Set_fact handler_exporter_status] ************************* 2025-09-27 00:53:56.634304 | orchestrator | Saturday 27 September 2025 00:48:09 +0000 (0:00:00.323) 0:04:37.013 **** 2025-09-27 00:53:56.634309 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:53:56.634314 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:53:56.634320 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:53:56.634325 | orchestrator | 2025-09-27 00:53:56.634331 | orchestrator | TASK [ceph-mgr : Set_fact container_exec_cmd] ********************************** 2025-09-27 00:53:56.634340 | orchestrator | Saturday 27 September 2025 00:48:09 
+0000 (0:00:00.524) 0:04:37.538 **** 2025-09-27 00:53:56.634346 | orchestrator | ok: [testbed-node-0] => (item=testbed-node-0) 2025-09-27 00:53:56.634351 | orchestrator | ok: [testbed-node-0 -> testbed-node-1(192.168.16.11)] => (item=testbed-node-1) 2025-09-27 00:53:56.634357 | orchestrator | ok: [testbed-node-0 -> testbed-node-2(192.168.16.12)] => (item=testbed-node-2) 2025-09-27 00:53:56.634362 | orchestrator | 2025-09-27 00:53:56.634367 | orchestrator | TASK [ceph-mgr : Include common.yml] ******************************************* 2025-09-27 00:53:56.634373 | orchestrator | Saturday 27 September 2025 00:48:10 +0000 (0:00:00.892) 0:04:38.431 **** 2025-09-27 00:53:56.634381 | orchestrator | included: /ansible/roles/ceph-mgr/tasks/common.yml for testbed-node-0, testbed-node-1, testbed-node-2 2025-09-27 00:53:56.634387 | orchestrator | 2025-09-27 00:53:56.634393 | orchestrator | TASK [ceph-mgr : Create mgr directory] ***************************************** 2025-09-27 00:53:56.634398 | orchestrator | Saturday 27 September 2025 00:48:11 +0000 (0:00:00.728) 0:04:39.160 **** 2025-09-27 00:53:56.634403 | orchestrator | changed: [testbed-node-1] 2025-09-27 00:53:56.634409 | orchestrator | changed: [testbed-node-0] 2025-09-27 00:53:56.634414 | orchestrator | changed: [testbed-node-2] 2025-09-27 00:53:56.634420 | orchestrator | 2025-09-27 00:53:56.634425 | orchestrator | TASK [ceph-mgr : Fetch ceph mgr keyring] *************************************** 2025-09-27 00:53:56.634430 | orchestrator | Saturday 27 September 2025 00:48:12 +0000 (0:00:00.678) 0:04:39.839 **** 2025-09-27 00:53:56.634436 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:53:56.634441 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:53:56.634447 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:53:56.634452 | orchestrator | 2025-09-27 00:53:56.634457 | orchestrator | TASK [ceph-mgr : Create ceph mgr keyring(s) on a mon node] ********************* 2025-09-27 00:53:56.634463 | orchestrator | Saturday 27 September 2025 00:48:12 +0000 (0:00:00.331) 0:04:40.170 **** 2025-09-27 00:53:56.634468 | orchestrator | changed: [testbed-node-0] => (item=None) 2025-09-27 00:53:56.634474 | orchestrator | changed: [testbed-node-0] => (item=None) 2025-09-27 00:53:56.634479 | orchestrator | changed: [testbed-node-0] => (item=None) 2025-09-27 00:53:56.634484 | orchestrator | changed: [testbed-node-0 -> {{ groups[mon_group_name][0] }}] 2025-09-27 00:53:56.634490 | orchestrator | 2025-09-27 00:53:56.634495 | orchestrator | TASK [ceph-mgr : Set_fact _mgr_keys] ******************************************* 2025-09-27 00:53:56.634500 | orchestrator | Saturday 27 September 2025 00:48:23 +0000 (0:00:10.928) 0:04:51.098 **** 2025-09-27 00:53:56.634506 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:53:56.634511 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:53:56.634517 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:53:56.634522 | orchestrator | 2025-09-27 00:53:56.634527 | orchestrator | TASK [ceph-mgr : Get keys from monitors] *************************************** 2025-09-27 00:53:56.634533 | orchestrator | Saturday 27 September 2025 00:48:24 +0000 (0:00:00.655) 0:04:51.754 **** 2025-09-27 00:53:56.634538 | orchestrator | skipping: [testbed-node-1] => (item=None)  2025-09-27 00:53:56.634543 | orchestrator | skipping: [testbed-node-0] => (item=None)  2025-09-27 00:53:56.634549 | orchestrator | skipping: [testbed-node-2] => (item=None)  2025-09-27 00:53:56.634554 | orchestrator | ok: [testbed-node-2 -> 
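The "Create ceph mgr keyring(s) on a mon node" step above asks the monitors for one key per mgr daemon and writes it out for distribution by the following tasks. A hedged sketch of the equivalent CLI; the capability profile shown is the usual one for mgr daemons and is an assumption, not read from this log:

    # Run on a monitor node; one key per mgr host (the three controller nodes above).
    for host in testbed-node-0 testbed-node-1 testbed-node-2; do
        ceph auth get-or-create "mgr.${host}" \
            mon 'allow profile mgr' osd 'allow *' mds 'allow *' \
            -o "/etc/ceph/ceph.mgr.${host}.keyring"
    done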
testbed-node-0(192.168.16.10)] => (item=None) 2025-09-27 00:53:56.634560 | orchestrator | ok: [testbed-node-1 -> testbed-node-0(192.168.16.10)] => (item=None) 2025-09-27 00:53:56.634565 | orchestrator | ok: [testbed-node-0] => (item=None) 2025-09-27 00:53:56.634570 | orchestrator | 2025-09-27 00:53:56.634576 | orchestrator | TASK [ceph-mgr : Copy ceph key(s) if needed] *********************************** 2025-09-27 00:53:56.634581 | orchestrator | Saturday 27 September 2025 00:48:26 +0000 (0:00:02.379) 0:04:54.133 **** 2025-09-27 00:53:56.634587 | orchestrator | skipping: [testbed-node-0] => (item=None)  2025-09-27 00:53:56.634592 | orchestrator | skipping: [testbed-node-1] => (item=None)  2025-09-27 00:53:56.634601 | orchestrator | skipping: [testbed-node-2] => (item=None)  2025-09-27 00:53:56.634610 | orchestrator | changed: [testbed-node-1] => (item=None) 2025-09-27 00:53:56.634615 | orchestrator | changed: [testbed-node-0] => (item=None) 2025-09-27 00:53:56.634621 | orchestrator | changed: [testbed-node-2] => (item=None) 2025-09-27 00:53:56.634626 | orchestrator | 2025-09-27 00:53:56.634631 | orchestrator | TASK [ceph-mgr : Set mgr key permissions] ************************************** 2025-09-27 00:53:56.634637 | orchestrator | Saturday 27 September 2025 00:48:27 +0000 (0:00:01.270) 0:04:55.403 **** 2025-09-27 00:53:56.634642 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:53:56.634647 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:53:56.634652 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:53:56.634656 | orchestrator | 2025-09-27 00:53:56.634661 | orchestrator | TASK [ceph-mgr : Append dashboard modules to ceph_mgr_modules] ***************** 2025-09-27 00:53:56.634666 | orchestrator | Saturday 27 September 2025 00:48:28 +0000 (0:00:00.698) 0:04:56.102 **** 2025-09-27 00:53:56.634671 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:53:56.634676 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:53:56.634680 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:53:56.634685 | orchestrator | 2025-09-27 00:53:56.634690 | orchestrator | TASK [ceph-mgr : Include pre_requisite.yml] ************************************ 2025-09-27 00:53:56.634695 | orchestrator | Saturday 27 September 2025 00:48:28 +0000 (0:00:00.536) 0:04:56.639 **** 2025-09-27 00:53:56.634700 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:53:56.634704 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:53:56.634709 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:53:56.634714 | orchestrator | 2025-09-27 00:53:56.634719 | orchestrator | TASK [ceph-mgr : Include start_mgr.yml] **************************************** 2025-09-27 00:53:56.634723 | orchestrator | Saturday 27 September 2025 00:48:29 +0000 (0:00:00.317) 0:04:56.957 **** 2025-09-27 00:53:56.634728 | orchestrator | included: /ansible/roles/ceph-mgr/tasks/start_mgr.yml for testbed-node-0, testbed-node-1, testbed-node-2 2025-09-27 00:53:56.634733 | orchestrator | 2025-09-27 00:53:56.634738 | orchestrator | TASK [ceph-mgr : Ensure systemd service override directory exists] ************* 2025-09-27 00:53:56.634743 | orchestrator | Saturday 27 September 2025 00:48:29 +0000 (0:00:00.487) 0:04:57.444 **** 2025-09-27 00:53:56.634747 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:53:56.634752 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:53:56.634757 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:53:56.634762 | orchestrator | 2025-09-27 00:53:56.634766 | orchestrator | TASK 
[ceph-mgr : Add ceph-mgr systemd service overrides] *********************** 2025-09-27 00:53:56.634771 | orchestrator | Saturday 27 September 2025 00:48:30 +0000 (0:00:00.551) 0:04:57.996 **** 2025-09-27 00:53:56.634776 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:53:56.634781 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:53:56.634785 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:53:56.634790 | orchestrator | 2025-09-27 00:53:56.634795 | orchestrator | TASK [ceph-mgr : Include_tasks systemd.yml] ************************************ 2025-09-27 00:53:56.634802 | orchestrator | Saturday 27 September 2025 00:48:30 +0000 (0:00:00.347) 0:04:58.344 **** 2025-09-27 00:53:56.634807 | orchestrator | included: /ansible/roles/ceph-mgr/tasks/systemd.yml for testbed-node-0, testbed-node-1, testbed-node-2 2025-09-27 00:53:56.634812 | orchestrator | 2025-09-27 00:53:56.634817 | orchestrator | TASK [ceph-mgr : Generate systemd unit file] *********************************** 2025-09-27 00:53:56.634822 | orchestrator | Saturday 27 September 2025 00:48:31 +0000 (0:00:00.487) 0:04:58.832 **** 2025-09-27 00:53:56.634827 | orchestrator | changed: [testbed-node-0] 2025-09-27 00:53:56.634832 | orchestrator | changed: [testbed-node-1] 2025-09-27 00:53:56.634836 | orchestrator | changed: [testbed-node-2] 2025-09-27 00:53:56.634841 | orchestrator | 2025-09-27 00:53:56.634846 | orchestrator | TASK [ceph-mgr : Generate systemd ceph-mgr target file] ************************ 2025-09-27 00:53:56.634851 | orchestrator | Saturday 27 September 2025 00:48:32 +0000 (0:00:01.399) 0:05:00.231 **** 2025-09-27 00:53:56.634856 | orchestrator | changed: [testbed-node-0] 2025-09-27 00:53:56.634861 | orchestrator | changed: [testbed-node-1] 2025-09-27 00:53:56.634869 | orchestrator | changed: [testbed-node-2] 2025-09-27 00:53:56.634874 | orchestrator | 2025-09-27 00:53:56.634879 | orchestrator | TASK [ceph-mgr : Enable ceph-mgr.target] *************************************** 2025-09-27 00:53:56.634884 | orchestrator | Saturday 27 September 2025 00:48:33 +0000 (0:00:01.222) 0:05:01.453 **** 2025-09-27 00:53:56.634888 | orchestrator | changed: [testbed-node-0] 2025-09-27 00:53:56.634893 | orchestrator | changed: [testbed-node-1] 2025-09-27 00:53:56.634898 | orchestrator | changed: [testbed-node-2] 2025-09-27 00:53:56.634903 | orchestrator | 2025-09-27 00:53:56.634908 | orchestrator | TASK [ceph-mgr : Systemd start mgr] ******************************************** 2025-09-27 00:53:56.634912 | orchestrator | Saturday 27 September 2025 00:48:35 +0000 (0:00:01.770) 0:05:03.224 **** 2025-09-27 00:53:56.634917 | orchestrator | changed: [testbed-node-1] 2025-09-27 00:53:56.634922 | orchestrator | changed: [testbed-node-0] 2025-09-27 00:53:56.634927 | orchestrator | changed: [testbed-node-2] 2025-09-27 00:53:56.634931 | orchestrator | 2025-09-27 00:53:56.634936 | orchestrator | TASK [ceph-mgr : Include mgr_modules.yml] ************************************** 2025-09-27 00:53:56.634941 | orchestrator | Saturday 27 September 2025 00:48:37 +0000 (0:00:01.946) 0:05:05.170 **** 2025-09-27 00:53:56.634946 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:53:56.634950 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:53:56.634955 | orchestrator | included: /ansible/roles/ceph-mgr/tasks/mgr_modules.yml for testbed-node-2 2025-09-27 00:53:56.634960 | orchestrator | 2025-09-27 00:53:56.634965 | orchestrator | TASK [ceph-mgr : Wait for all mgr to be up] ************************************ 2025-09-27 
00:53:56.634970 | orchestrator | Saturday 27 September 2025 00:48:38 +0000 (0:00:00.628) 0:05:05.799 **** 2025-09-27 00:53:56.634974 | orchestrator | FAILED - RETRYING: [testbed-node-2 -> testbed-node-0]: Wait for all mgr to be up (30 retries left). 2025-09-27 00:53:56.634979 | orchestrator | FAILED - RETRYING: [testbed-node-2 -> testbed-node-0]: Wait for all mgr to be up (29 retries left). 2025-09-27 00:53:56.634984 | orchestrator | FAILED - RETRYING: [testbed-node-2 -> testbed-node-0]: Wait for all mgr to be up (28 retries left). 2025-09-27 00:53:56.634992 | orchestrator | FAILED - RETRYING: [testbed-node-2 -> testbed-node-0]: Wait for all mgr to be up (27 retries left). 2025-09-27 00:53:56.634997 | orchestrator | FAILED - RETRYING: [testbed-node-2 -> testbed-node-0]: Wait for all mgr to be up (26 retries left). 2025-09-27 00:53:56.635002 | orchestrator | FAILED - RETRYING: [testbed-node-2 -> testbed-node-0]: Wait for all mgr to be up (25 retries left). 2025-09-27 00:53:56.635007 | orchestrator | ok: [testbed-node-2 -> testbed-node-0(192.168.16.10)] 2025-09-27 00:53:56.635011 | orchestrator | 2025-09-27 00:53:56.635016 | orchestrator | TASK [ceph-mgr : Get enabled modules from ceph-mgr] **************************** 2025-09-27 00:53:56.635021 | orchestrator | Saturday 27 September 2025 00:49:14 +0000 (0:00:36.404) 0:05:42.203 **** 2025-09-27 00:53:56.635026 | orchestrator | ok: [testbed-node-2 -> testbed-node-0(192.168.16.10)] 2025-09-27 00:53:56.635031 | orchestrator | 2025-09-27 00:53:56.635035 | orchestrator | TASK [ceph-mgr : Set _ceph_mgr_modules fact (convert _ceph_mgr_modules.stdout to a dict)] *** 2025-09-27 00:53:56.635040 | orchestrator | Saturday 27 September 2025 00:49:15 +0000 (0:00:01.362) 0:05:43.566 **** 2025-09-27 00:53:56.635045 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:53:56.635050 | orchestrator | 2025-09-27 00:53:56.635054 | orchestrator | TASK [ceph-mgr : Set _disabled_ceph_mgr_modules fact] ************************** 2025-09-27 00:53:56.635059 | orchestrator | Saturday 27 September 2025 00:49:16 +0000 (0:00:00.300) 0:05:43.866 **** 2025-09-27 00:53:56.635064 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:53:56.635069 | orchestrator | 2025-09-27 00:53:56.635074 | orchestrator | TASK [ceph-mgr : Disable ceph mgr enabled modules] ***************************** 2025-09-27 00:53:56.635078 | orchestrator | Saturday 27 September 2025 00:49:16 +0000 (0:00:00.132) 0:05:43.999 **** 2025-09-27 00:53:56.635083 | orchestrator | changed: [testbed-node-2 -> testbed-node-0(192.168.16.10)] => (item=iostat) 2025-09-27 00:53:56.635099 | orchestrator | changed: [testbed-node-2 -> testbed-node-0(192.168.16.10)] => (item=nfs) 2025-09-27 00:53:56.635107 | orchestrator | changed: [testbed-node-2 -> testbed-node-0(192.168.16.10)] => (item=restful) 2025-09-27 00:53:56.635112 | orchestrator | 2025-09-27 00:53:56.635117 | orchestrator | TASK [ceph-mgr : Add modules to ceph-mgr] ************************************** 2025-09-27 00:53:56.635122 | orchestrator | Saturday 27 September 2025 00:49:22 +0000 (0:00:06.451) 0:05:50.450 **** 2025-09-27 00:53:56.635126 | orchestrator | skipping: [testbed-node-2] => (item=balancer)  2025-09-27 00:53:56.635131 | orchestrator | changed: [testbed-node-2 -> testbed-node-0(192.168.16.10)] => (item=dashboard) 2025-09-27 00:53:56.635136 | orchestrator | changed: [testbed-node-2 -> testbed-node-0(192.168.16.10)] => (item=prometheus) 2025-09-27 00:53:56.635141 | orchestrator | skipping: [testbed-node-2] => (item=status)  2025-09-27 00:53:56.635146 | 
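The mgr module tasks above translate directly into ceph mgr module calls: the run disables iostat, nfs and restful, enables dashboard and prometheus, and retries "Wait for all mgr to be up" until the active/standby set is complete. A minimal sketch against any monitor; the use of jq and the exact standby count are assumptions for illustration:

    for m in iostat nfs restful; do ceph mgr module disable "$m"; done
    for m in dashboard prometheus; do ceph mgr module enable "$m"; done

    # Rough equivalent of the "Wait for all mgr to be up" retry loop:
    # one active mgr plus two standbys for a three-node control plane.
    until ceph mgr dump | jq -e '.active_name != "" and (.standbys | length) == 2' >/dev/null; do
        sleep 5
    done
    ceph mgr module ls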
orchestrator | 2025-09-27 00:53:56.635150 | orchestrator | RUNNING HANDLER [ceph-handler : Make tempdir for scripts] ********************** 2025-09-27 00:53:56.635158 | orchestrator | Saturday 27 September 2025 00:49:27 +0000 (0:00:04.846) 0:05:55.297 **** 2025-09-27 00:53:56.635163 | orchestrator | changed: [testbed-node-0] 2025-09-27 00:53:56.635168 | orchestrator | changed: [testbed-node-1] 2025-09-27 00:53:56.635173 | orchestrator | changed: [testbed-node-2] 2025-09-27 00:53:56.635177 | orchestrator | 2025-09-27 00:53:56.635182 | orchestrator | RUNNING HANDLER [ceph-handler : Mgrs handler] ********************************** 2025-09-27 00:53:56.635187 | orchestrator | Saturday 27 September 2025 00:49:28 +0000 (0:00:00.714) 0:05:56.012 **** 2025-09-27 00:53:56.635192 | orchestrator | included: /ansible/roles/ceph-handler/tasks/handler_mgrs.yml for testbed-node-0, testbed-node-1, testbed-node-2 2025-09-27 00:53:56.635197 | orchestrator | 2025-09-27 00:53:56.635202 | orchestrator | RUNNING HANDLER [ceph-handler : Set _mgr_handler_called before restart] ******** 2025-09-27 00:53:56.635206 | orchestrator | Saturday 27 September 2025 00:49:28 +0000 (0:00:00.505) 0:05:56.517 **** 2025-09-27 00:53:56.635211 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:53:56.635216 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:53:56.635221 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:53:56.635225 | orchestrator | 2025-09-27 00:53:56.635230 | orchestrator | RUNNING HANDLER [ceph-handler : Copy mgr restart script] *********************** 2025-09-27 00:53:56.635235 | orchestrator | Saturday 27 September 2025 00:49:29 +0000 (0:00:00.301) 0:05:56.819 **** 2025-09-27 00:53:56.635240 | orchestrator | changed: [testbed-node-0] 2025-09-27 00:53:56.635245 | orchestrator | changed: [testbed-node-1] 2025-09-27 00:53:56.635249 | orchestrator | changed: [testbed-node-2] 2025-09-27 00:53:56.635254 | orchestrator | 2025-09-27 00:53:56.635259 | orchestrator | RUNNING HANDLER [ceph-handler : Restart ceph mgr daemon(s)] ******************** 2025-09-27 00:53:56.635264 | orchestrator | Saturday 27 September 2025 00:49:30 +0000 (0:00:01.464) 0:05:58.284 **** 2025-09-27 00:53:56.635269 | orchestrator | skipping: [testbed-node-0] => (item=testbed-node-0)  2025-09-27 00:53:56.635273 | orchestrator | skipping: [testbed-node-0] => (item=testbed-node-1)  2025-09-27 00:53:56.635278 | orchestrator | skipping: [testbed-node-0] => (item=testbed-node-2)  2025-09-27 00:53:56.635283 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:53:56.635288 | orchestrator | 2025-09-27 00:53:56.635292 | orchestrator | RUNNING HANDLER [ceph-handler : Set _mgr_handler_called after restart] ********* 2025-09-27 00:53:56.635297 | orchestrator | Saturday 27 September 2025 00:49:31 +0000 (0:00:00.600) 0:05:58.884 **** 2025-09-27 00:53:56.635302 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:53:56.635307 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:53:56.635312 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:53:56.635316 | orchestrator | 2025-09-27 00:53:56.635321 | orchestrator | PLAY [Apply role ceph-osd] ***************************************************** 2025-09-27 00:53:56.635326 | orchestrator | 2025-09-27 00:53:56.635331 | orchestrator | TASK [ceph-handler : Include check_running_cluster.yml] ************************ 2025-09-27 00:53:56.635336 | orchestrator | Saturday 27 September 2025 00:49:31 +0000 (0:00:00.533) 0:05:59.418 **** 2025-09-27 00:53:56.635340 | orchestrator | included: 
/ansible/roles/ceph-handler/tasks/check_running_cluster.yml for testbed-node-3, testbed-node-4, testbed-node-5 2025-09-27 00:53:56.635349 | orchestrator | 2025-09-27 00:53:56.635354 | orchestrator | TASK [ceph-handler : Include check_running_containers.yml] ********************* 2025-09-27 00:53:56.635362 | orchestrator | Saturday 27 September 2025 00:49:32 +0000 (0:00:00.678) 0:06:00.096 **** 2025-09-27 00:53:56.635367 | orchestrator | included: /ansible/roles/ceph-handler/tasks/check_running_containers.yml for testbed-node-3, testbed-node-4, testbed-node-5 2025-09-27 00:53:56.635372 | orchestrator | 2025-09-27 00:53:56.635376 | orchestrator | TASK [ceph-handler : Check for a mon container] ******************************** 2025-09-27 00:53:56.635381 | orchestrator | Saturday 27 September 2025 00:49:32 +0000 (0:00:00.493) 0:06:00.590 **** 2025-09-27 00:53:56.635386 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:53:56.635391 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:53:56.635396 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:53:56.635401 | orchestrator | 2025-09-27 00:53:56.635406 | orchestrator | TASK [ceph-handler : Check for an osd container] ******************************* 2025-09-27 00:53:56.635410 | orchestrator | Saturday 27 September 2025 00:49:33 +0000 (0:00:00.498) 0:06:01.088 **** 2025-09-27 00:53:56.635415 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:53:56.635420 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:53:56.635425 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:53:56.635430 | orchestrator | 2025-09-27 00:53:56.635434 | orchestrator | TASK [ceph-handler : Check for a mds container] ******************************** 2025-09-27 00:53:56.635439 | orchestrator | Saturday 27 September 2025 00:49:34 +0000 (0:00:00.683) 0:06:01.771 **** 2025-09-27 00:53:56.635444 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:53:56.635449 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:53:56.635454 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:53:56.635458 | orchestrator | 2025-09-27 00:53:56.635463 | orchestrator | TASK [ceph-handler : Check for a rgw container] ******************************** 2025-09-27 00:53:56.635468 | orchestrator | Saturday 27 September 2025 00:49:34 +0000 (0:00:00.747) 0:06:02.519 **** 2025-09-27 00:53:56.635473 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:53:56.635478 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:53:56.635482 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:53:56.635487 | orchestrator | 2025-09-27 00:53:56.635492 | orchestrator | TASK [ceph-handler : Check for a mgr container] ******************************** 2025-09-27 00:53:56.635497 | orchestrator | Saturday 27 September 2025 00:49:35 +0000 (0:00:00.674) 0:06:03.194 **** 2025-09-27 00:53:56.635502 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:53:56.635506 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:53:56.635511 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:53:56.635516 | orchestrator | 2025-09-27 00:53:56.635521 | orchestrator | TASK [ceph-handler : Check for a rbd mirror container] ************************* 2025-09-27 00:53:56.635526 | orchestrator | Saturday 27 September 2025 00:49:36 +0000 (0:00:00.513) 0:06:03.708 **** 2025-09-27 00:53:56.635530 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:53:56.635535 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:53:56.635540 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:53:56.635545 | orchestrator | 
2025-09-27 00:53:56.635549 | orchestrator | TASK [ceph-handler : Check for a nfs container] ******************************** 2025-09-27 00:53:56.635554 | orchestrator | Saturday 27 September 2025 00:49:36 +0000 (0:00:00.320) 0:06:04.029 **** 2025-09-27 00:53:56.635559 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:53:56.635568 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:53:56.635573 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:53:56.635578 | orchestrator | 2025-09-27 00:53:56.635583 | orchestrator | TASK [ceph-handler : Check for a ceph-crash container] ************************* 2025-09-27 00:53:56.635587 | orchestrator | Saturday 27 September 2025 00:49:36 +0000 (0:00:00.305) 0:06:04.334 **** 2025-09-27 00:53:56.635592 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:53:56.635597 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:53:56.635602 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:53:56.635607 | orchestrator | 2025-09-27 00:53:56.635611 | orchestrator | TASK [ceph-handler : Check for a ceph-exporter container] ********************** 2025-09-27 00:53:56.635620 | orchestrator | Saturday 27 September 2025 00:49:37 +0000 (0:00:00.650) 0:06:04.985 **** 2025-09-27 00:53:56.635625 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:53:56.635630 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:53:56.635635 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:53:56.635639 | orchestrator | 2025-09-27 00:53:56.635644 | orchestrator | TASK [ceph-handler : Include check_socket_non_container.yml] ******************* 2025-09-27 00:53:56.635649 | orchestrator | Saturday 27 September 2025 00:49:38 +0000 (0:00:00.868) 0:06:05.853 **** 2025-09-27 00:53:56.635654 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:53:56.635659 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:53:56.635664 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:53:56.635668 | orchestrator | 2025-09-27 00:53:56.635673 | orchestrator | TASK [ceph-handler : Set_fact handler_mon_status] ****************************** 2025-09-27 00:53:56.635678 | orchestrator | Saturday 27 September 2025 00:49:38 +0000 (0:00:00.325) 0:06:06.179 **** 2025-09-27 00:53:56.635683 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:53:56.635687 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:53:56.635692 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:53:56.635697 | orchestrator | 2025-09-27 00:53:56.635702 | orchestrator | TASK [ceph-handler : Set_fact handler_osd_status] ****************************** 2025-09-27 00:53:56.635706 | orchestrator | Saturday 27 September 2025 00:49:38 +0000 (0:00:00.296) 0:06:06.475 **** 2025-09-27 00:53:56.635711 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:53:56.635716 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:53:56.635721 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:53:56.635725 | orchestrator | 2025-09-27 00:53:56.635730 | orchestrator | TASK [ceph-handler : Set_fact handler_mds_status] ****************************** 2025-09-27 00:53:56.635735 | orchestrator | Saturday 27 September 2025 00:49:39 +0000 (0:00:00.308) 0:06:06.784 **** 2025-09-27 00:53:56.635740 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:53:56.635745 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:53:56.635749 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:53:56.635754 | orchestrator | 2025-09-27 00:53:56.635759 | orchestrator | TASK [ceph-handler : Set_fact handler_rgw_status] ****************************** 
2025-09-27 00:53:56.635764 | orchestrator | Saturday 27 September 2025 00:49:39 +0000 (0:00:00.569) 0:06:07.353 **** 2025-09-27 00:53:56.635769 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:53:56.635773 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:53:56.635778 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:53:56.635783 | orchestrator | 2025-09-27 00:53:56.635788 | orchestrator | TASK [ceph-handler : Set_fact handler_nfs_status] ****************************** 2025-09-27 00:53:56.635792 | orchestrator | Saturday 27 September 2025 00:49:40 +0000 (0:00:00.343) 0:06:07.697 **** 2025-09-27 00:53:56.635800 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:53:56.635805 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:53:56.635810 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:53:56.635815 | orchestrator | 2025-09-27 00:53:56.635820 | orchestrator | TASK [ceph-handler : Set_fact handler_rbd_status] ****************************** 2025-09-27 00:53:56.635825 | orchestrator | Saturday 27 September 2025 00:49:40 +0000 (0:00:00.324) 0:06:08.021 **** 2025-09-27 00:53:56.635829 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:53:56.635834 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:53:56.635839 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:53:56.635844 | orchestrator | 2025-09-27 00:53:56.635849 | orchestrator | TASK [ceph-handler : Set_fact handler_mgr_status] ****************************** 2025-09-27 00:53:56.635853 | orchestrator | Saturday 27 September 2025 00:49:40 +0000 (0:00:00.318) 0:06:08.340 **** 2025-09-27 00:53:56.635858 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:53:56.635863 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:53:56.635868 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:53:56.635873 | orchestrator | 2025-09-27 00:53:56.635877 | orchestrator | TASK [ceph-handler : Set_fact handler_crash_status] **************************** 2025-09-27 00:53:56.635882 | orchestrator | Saturday 27 September 2025 00:49:41 +0000 (0:00:00.561) 0:06:08.901 **** 2025-09-27 00:53:56.635890 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:53:56.635895 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:53:56.635900 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:53:56.635905 | orchestrator | 2025-09-27 00:53:56.635909 | orchestrator | TASK [ceph-handler : Set_fact handler_exporter_status] ************************* 2025-09-27 00:53:56.635914 | orchestrator | Saturday 27 September 2025 00:49:41 +0000 (0:00:00.399) 0:06:09.301 **** 2025-09-27 00:53:56.635919 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:53:56.635924 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:53:56.635929 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:53:56.635933 | orchestrator | 2025-09-27 00:53:56.635938 | orchestrator | TASK [ceph-osd : Set_fact add_osd] ********************************************* 2025-09-27 00:53:56.635943 | orchestrator | Saturday 27 September 2025 00:49:42 +0000 (0:00:00.541) 0:06:09.842 **** 2025-09-27 00:53:56.635948 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:53:56.635953 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:53:56.635957 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:53:56.635962 | orchestrator | 2025-09-27 00:53:56.635967 | orchestrator | TASK [ceph-osd : Set_fact container_exec_cmd] ********************************** 2025-09-27 00:53:56.635972 | orchestrator | Saturday 27 September 2025 00:49:42 +0000 (0:00:00.342) 0:06:10.186 **** 2025-09-27 
00:53:56.635977 | orchestrator | ok: [testbed-node-3 -> testbed-node-0(192.168.16.10)] => (item=testbed-node-0) 2025-09-27 00:53:56.635981 | orchestrator | ok: [testbed-node-3 -> testbed-node-1(192.168.16.11)] => (item=testbed-node-1) 2025-09-27 00:53:56.635986 | orchestrator | ok: [testbed-node-3 -> testbed-node-2(192.168.16.12)] => (item=testbed-node-2) 2025-09-27 00:53:56.635991 | orchestrator | 2025-09-27 00:53:56.635996 | orchestrator | TASK [ceph-osd : Include_tasks system_tuning.yml] ****************************** 2025-09-27 00:53:56.636004 | orchestrator | Saturday 27 September 2025 00:49:43 +0000 (0:00:01.159) 0:06:11.345 **** 2025-09-27 00:53:56.636008 | orchestrator | included: /ansible/roles/ceph-osd/tasks/system_tuning.yml for testbed-node-3, testbed-node-4, testbed-node-5 2025-09-27 00:53:56.636013 | orchestrator | 2025-09-27 00:53:56.636018 | orchestrator | TASK [ceph-osd : Create tmpfiles.d directory] ********************************** 2025-09-27 00:53:56.636023 | orchestrator | Saturday 27 September 2025 00:49:44 +0000 (0:00:00.521) 0:06:11.867 **** 2025-09-27 00:53:56.636028 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:53:56.636032 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:53:56.636037 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:53:56.636042 | orchestrator | 2025-09-27 00:53:56.636047 | orchestrator | TASK [ceph-osd : Disable transparent hugepage] ********************************* 2025-09-27 00:53:56.636051 | orchestrator | Saturday 27 September 2025 00:49:44 +0000 (0:00:00.275) 0:06:12.142 **** 2025-09-27 00:53:56.636056 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:53:56.636061 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:53:56.636066 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:53:56.636070 | orchestrator | 2025-09-27 00:53:56.636075 | orchestrator | TASK [ceph-osd : Get default vm.min_free_kbytes] ******************************* 2025-09-27 00:53:56.636080 | orchestrator | Saturday 27 September 2025 00:49:44 +0000 (0:00:00.426) 0:06:12.568 **** 2025-09-27 00:53:56.636097 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:53:56.636102 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:53:56.636106 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:53:56.636111 | orchestrator | 2025-09-27 00:53:56.636116 | orchestrator | TASK [ceph-osd : Set_fact vm_min_free_kbytes] ********************************** 2025-09-27 00:53:56.636121 | orchestrator | Saturday 27 September 2025 00:49:45 +0000 (0:00:00.572) 0:06:13.141 **** 2025-09-27 00:53:56.636126 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:53:56.636130 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:53:56.636135 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:53:56.636140 | orchestrator | 2025-09-27 00:53:56.636145 | orchestrator | TASK [ceph-osd : Apply operating system tuning] ******************************** 2025-09-27 00:53:56.636150 | orchestrator | Saturday 27 September 2025 00:49:45 +0000 (0:00:00.301) 0:06:13.443 **** 2025-09-27 00:53:56.636158 | orchestrator | changed: [testbed-node-3] => (item={'name': 'fs.aio-max-nr', 'value': '1048576', 'enable': True}) 2025-09-27 00:53:56.636162 | orchestrator | changed: [testbed-node-5] => (item={'name': 'fs.aio-max-nr', 'value': '1048576', 'enable': True}) 2025-09-27 00:53:56.636167 | orchestrator | changed: [testbed-node-5] => (item={'name': 'fs.file-max', 'value': 26234859}) 2025-09-27 00:53:56.636172 | orchestrator | changed: [testbed-node-4] => (item={'name': 'fs.aio-max-nr', 
'value': '1048576', 'enable': True}) 2025-09-27 00:53:56.636177 | orchestrator | changed: [testbed-node-3] => (item={'name': 'fs.file-max', 'value': 26234859}) 2025-09-27 00:53:56.636182 | orchestrator | changed: [testbed-node-4] => (item={'name': 'fs.file-max', 'value': 26234859}) 2025-09-27 00:53:56.636186 | orchestrator | changed: [testbed-node-5] => (item={'name': 'vm.zone_reclaim_mode', 'value': 0}) 2025-09-27 00:53:56.636194 | orchestrator | changed: [testbed-node-3] => (item={'name': 'vm.zone_reclaim_mode', 'value': 0}) 2025-09-27 00:53:56.636199 | orchestrator | changed: [testbed-node-4] => (item={'name': 'vm.zone_reclaim_mode', 'value': 0}) 2025-09-27 00:53:56.636204 | orchestrator | changed: [testbed-node-5] => (item={'name': 'vm.swappiness', 'value': 10}) 2025-09-27 00:53:56.636209 | orchestrator | changed: [testbed-node-3] => (item={'name': 'vm.swappiness', 'value': 10}) 2025-09-27 00:53:56.636214 | orchestrator | changed: [testbed-node-4] => (item={'name': 'vm.swappiness', 'value': 10}) 2025-09-27 00:53:56.636218 | orchestrator | changed: [testbed-node-5] => (item={'name': 'vm.min_free_kbytes', 'value': '67584'}) 2025-09-27 00:53:56.636223 | orchestrator | changed: [testbed-node-3] => (item={'name': 'vm.min_free_kbytes', 'value': '67584'}) 2025-09-27 00:53:56.636228 | orchestrator | changed: [testbed-node-4] => (item={'name': 'vm.min_free_kbytes', 'value': '67584'}) 2025-09-27 00:53:56.636233 | orchestrator | 2025-09-27 00:53:56.636238 | orchestrator | TASK [ceph-osd : Install dependencies] ***************************************** 2025-09-27 00:53:56.636242 | orchestrator | Saturday 27 September 2025 00:49:48 +0000 (0:00:03.106) 0:06:16.549 **** 2025-09-27 00:53:56.636247 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:53:56.636252 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:53:56.636257 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:53:56.636262 | orchestrator | 2025-09-27 00:53:56.636266 | orchestrator | TASK [ceph-osd : Include_tasks common.yml] ************************************* 2025-09-27 00:53:56.636271 | orchestrator | Saturday 27 September 2025 00:49:49 +0000 (0:00:00.548) 0:06:17.097 **** 2025-09-27 00:53:56.636276 | orchestrator | included: /ansible/roles/ceph-osd/tasks/common.yml for testbed-node-3, testbed-node-4, testbed-node-5 2025-09-27 00:53:56.636281 | orchestrator | 2025-09-27 00:53:56.636286 | orchestrator | TASK [ceph-osd : Create bootstrap-osd and osd directories] ********************* 2025-09-27 00:53:56.636290 | orchestrator | Saturday 27 September 2025 00:49:50 +0000 (0:00:00.587) 0:06:17.684 **** 2025-09-27 00:53:56.636295 | orchestrator | ok: [testbed-node-3] => (item=/var/lib/ceph/bootstrap-osd/) 2025-09-27 00:53:56.636300 | orchestrator | ok: [testbed-node-4] => (item=/var/lib/ceph/bootstrap-osd/) 2025-09-27 00:53:56.636305 | orchestrator | ok: [testbed-node-5] => (item=/var/lib/ceph/bootstrap-osd/) 2025-09-27 00:53:56.636309 | orchestrator | ok: [testbed-node-3] => (item=/var/lib/ceph/osd/) 2025-09-27 00:53:56.636314 | orchestrator | ok: [testbed-node-4] => (item=/var/lib/ceph/osd/) 2025-09-27 00:53:56.636319 | orchestrator | ok: [testbed-node-5] => (item=/var/lib/ceph/osd/) 2025-09-27 00:53:56.636324 | orchestrator | 2025-09-27 00:53:56.636329 | orchestrator | TASK [ceph-osd : Get keys from monitors] *************************************** 2025-09-27 00:53:56.636334 | orchestrator | Saturday 27 September 2025 00:49:51 +0000 (0:00:01.018) 0:06:18.703 **** 2025-09-27 00:53:56.636341 | orchestrator | ok: 
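The "Apply operating system tuning" loop above sets the kernel parameters listed in its items on each OSD node. Roughly the same thing by hand, written persistently the way a sysctl drop-in would be (the drop-in file name is illustrative, the values are the ones from the log):

    cat > /etc/sysctl.d/90-ceph-osd-tuning.conf <<'EOF'
    fs.aio-max-nr = 1048576
    fs.file-max = 26234859
    vm.zone_reclaim_mode = 0
    vm.swappiness = 10
    vm.min_free_kbytes = 67584
    EOF
    sysctl --system   # reload all sysctl.d drop-ins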
[testbed-node-3 -> testbed-node-0(192.168.16.10)] => (item=None) 2025-09-27 00:53:56.636346 | orchestrator | skipping: [testbed-node-3] => (item=None)  2025-09-27 00:53:56.636351 | orchestrator | ok: [testbed-node-3 -> {{ groups.get(mon_group_name)[0] }}] 2025-09-27 00:53:56.636356 | orchestrator | 2025-09-27 00:53:56.636364 | orchestrator | TASK [ceph-osd : Copy ceph key(s) if needed] *********************************** 2025-09-27 00:53:56.636369 | orchestrator | Saturday 27 September 2025 00:49:53 +0000 (0:00:02.282) 0:06:20.985 **** 2025-09-27 00:53:56.636373 | orchestrator | changed: [testbed-node-3] => (item=None) 2025-09-27 00:53:56.636378 | orchestrator | skipping: [testbed-node-3] => (item=None)  2025-09-27 00:53:56.636383 | orchestrator | changed: [testbed-node-3] 2025-09-27 00:53:56.636388 | orchestrator | changed: [testbed-node-4] => (item=None) 2025-09-27 00:53:56.636393 | orchestrator | skipping: [testbed-node-4] => (item=None)  2025-09-27 00:53:56.636397 | orchestrator | changed: [testbed-node-4] 2025-09-27 00:53:56.636402 | orchestrator | changed: [testbed-node-5] => (item=None) 2025-09-27 00:53:56.636407 | orchestrator | skipping: [testbed-node-5] => (item=None)  2025-09-27 00:53:56.636412 | orchestrator | changed: [testbed-node-5] 2025-09-27 00:53:56.636416 | orchestrator | 2025-09-27 00:53:56.636421 | orchestrator | TASK [ceph-osd : Set noup flag] ************************************************ 2025-09-27 00:53:56.636426 | orchestrator | Saturday 27 September 2025 00:49:54 +0000 (0:00:01.516) 0:06:22.502 **** 2025-09-27 00:53:56.636431 | orchestrator | changed: [testbed-node-3 -> testbed-node-0(192.168.16.10)] 2025-09-27 00:53:56.636436 | orchestrator | 2025-09-27 00:53:56.636440 | orchestrator | TASK [ceph-osd : Include_tasks scenarios/lvm.yml] ****************************** 2025-09-27 00:53:56.636445 | orchestrator | Saturday 27 September 2025 00:49:57 +0000 (0:00:02.250) 0:06:24.752 **** 2025-09-27 00:53:56.636450 | orchestrator | included: /ansible/roles/ceph-osd/tasks/scenarios/lvm.yml for testbed-node-3, testbed-node-4, testbed-node-5 2025-09-27 00:53:56.636455 | orchestrator | 2025-09-27 00:53:56.636460 | orchestrator | TASK [ceph-osd : Use ceph-volume to create osds] ******************************* 2025-09-27 00:53:56.636465 | orchestrator | Saturday 27 September 2025 00:49:57 +0000 (0:00:00.609) 0:06:25.362 **** 2025-09-27 00:53:56.636469 | orchestrator | changed: [testbed-node-3] => (item={'data': 'osd-block-025d8a54-72cd-5dfc-843f-2890244ba468', 'data_vg': 'ceph-025d8a54-72cd-5dfc-843f-2890244ba468'}) 2025-09-27 00:53:56.636475 | orchestrator | changed: [testbed-node-4] => (item={'data': 'osd-block-e62f59a6-4044-5e93-b85c-9f8cca280e9f', 'data_vg': 'ceph-e62f59a6-4044-5e93-b85c-9f8cca280e9f'}) 2025-09-27 00:53:56.636479 | orchestrator | changed: [testbed-node-5] => (item={'data': 'osd-block-03e94b17-8e91-5aba-9ae0-0b9f0a63cf06', 'data_vg': 'ceph-03e94b17-8e91-5aba-9ae0-0b9f0a63cf06'}) 2025-09-27 00:53:56.636487 | orchestrator | changed: [testbed-node-3] => (item={'data': 'osd-block-9ca7935d-e986-5962-b530-505e6c7ac609', 'data_vg': 'ceph-9ca7935d-e986-5962-b530-505e6c7ac609'}) 2025-09-27 00:53:56.636492 | orchestrator | changed: [testbed-node-4] => (item={'data': 'osd-block-634a63d2-bd22-5328-9676-28392545ed43', 'data_vg': 'ceph-634a63d2-bd22-5328-9676-28392545ed43'}) 2025-09-27 00:53:56.636497 | orchestrator | changed: [testbed-node-5] => (item={'data': 'osd-block-26537eb5-d37a-51fe-a7ad-0ae3582304de', 'data_vg': 'ceph-26537eb5-d37a-51fe-a7ad-0ae3582304de'}) 
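The "Use ceph-volume to create osds" items above hand pre-created LVM logical volumes (data) inside their volume groups (data_vg) to ceph-volume, which prepares and activates one bluestore OSD per item. A sketch of the equivalent manual call for the first device listed for testbed-node-3:

    ceph-volume lvm create \
        --data ceph-025d8a54-72cd-5dfc-843f-2890244ba468/osd-block-025d8a54-72cd-5dfc-843f-2890244ba468

    # Show the OSDs ceph-volume now knows about on this host.
    ceph-volume lvm list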
2025-09-27 00:53:56.636502 | orchestrator | 2025-09-27 00:53:56.636507 | orchestrator | TASK [ceph-osd : Include_tasks scenarios/lvm-batch.yml] ************************ 2025-09-27 00:53:56.636512 | orchestrator | Saturday 27 September 2025 00:50:39 +0000 (0:00:41.710) 0:07:07.073 **** 2025-09-27 00:53:56.636517 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:53:56.636521 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:53:56.636526 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:53:56.636531 | orchestrator | 2025-09-27 00:53:56.636536 | orchestrator | TASK [ceph-osd : Include_tasks start_osds.yml] ********************************* 2025-09-27 00:53:56.636540 | orchestrator | Saturday 27 September 2025 00:50:39 +0000 (0:00:00.552) 0:07:07.625 **** 2025-09-27 00:53:56.636545 | orchestrator | included: /ansible/roles/ceph-osd/tasks/start_osds.yml for testbed-node-3, testbed-node-4, testbed-node-5 2025-09-27 00:53:56.636550 | orchestrator | 2025-09-27 00:53:56.636555 | orchestrator | TASK [ceph-osd : Get osd ids] ************************************************** 2025-09-27 00:53:56.636560 | orchestrator | Saturday 27 September 2025 00:50:40 +0000 (0:00:00.515) 0:07:08.140 **** 2025-09-27 00:53:56.636568 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:53:56.636572 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:53:56.636577 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:53:56.636582 | orchestrator | 2025-09-27 00:53:56.636587 | orchestrator | TASK [ceph-osd : Collect osd ids] ********************************************** 2025-09-27 00:53:56.636592 | orchestrator | Saturday 27 September 2025 00:50:41 +0000 (0:00:00.683) 0:07:08.823 **** 2025-09-27 00:53:56.636596 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:53:56.636601 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:53:56.636606 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:53:56.636611 | orchestrator | 2025-09-27 00:53:56.636616 | orchestrator | TASK [ceph-osd : Include_tasks systemd.yml] ************************************ 2025-09-27 00:53:56.636620 | orchestrator | Saturday 27 September 2025 00:50:44 +0000 (0:00:02.876) 0:07:11.700 **** 2025-09-27 00:53:56.636625 | orchestrator | included: /ansible/roles/ceph-osd/tasks/systemd.yml for testbed-node-3, testbed-node-4, testbed-node-5 2025-09-27 00:53:56.636630 | orchestrator | 2025-09-27 00:53:56.636635 | orchestrator | TASK [ceph-osd : Generate systemd unit file] *********************************** 2025-09-27 00:53:56.636640 | orchestrator | Saturday 27 September 2025 00:50:44 +0000 (0:00:00.455) 0:07:12.155 **** 2025-09-27 00:53:56.636644 | orchestrator | changed: [testbed-node-3] 2025-09-27 00:53:56.636649 | orchestrator | changed: [testbed-node-4] 2025-09-27 00:53:56.636654 | orchestrator | changed: [testbed-node-5] 2025-09-27 00:53:56.636659 | orchestrator | 2025-09-27 00:53:56.636666 | orchestrator | TASK [ceph-osd : Generate systemd ceph-osd target file] ************************ 2025-09-27 00:53:56.636671 | orchestrator | Saturday 27 September 2025 00:50:45 +0000 (0:00:01.241) 0:07:13.397 **** 2025-09-27 00:53:56.636676 | orchestrator | changed: [testbed-node-3] 2025-09-27 00:53:56.636681 | orchestrator | changed: [testbed-node-4] 2025-09-27 00:53:56.636685 | orchestrator | changed: [testbed-node-5] 2025-09-27 00:53:56.636690 | orchestrator | 2025-09-27 00:53:56.636695 | orchestrator | TASK [ceph-osd : Enable ceph-osd.target] *************************************** 2025-09-27 00:53:56.636700 | orchestrator | Saturday 27 
September 2025 00:50:47 +0000 (0:00:01.405) 0:07:14.803 **** 2025-09-27 00:53:56.636704 | orchestrator | changed: [testbed-node-4] 2025-09-27 00:53:56.636709 | orchestrator | changed: [testbed-node-5] 2025-09-27 00:53:56.636714 | orchestrator | changed: [testbed-node-3] 2025-09-27 00:53:56.636719 | orchestrator | 2025-09-27 00:53:56.636724 | orchestrator | TASK [ceph-osd : Ensure systemd service override directory exists] ************* 2025-09-27 00:53:56.636728 | orchestrator | Saturday 27 September 2025 00:50:49 +0000 (0:00:02.458) 0:07:17.261 **** 2025-09-27 00:53:56.636733 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:53:56.636738 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:53:56.636743 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:53:56.636747 | orchestrator | 2025-09-27 00:53:56.636752 | orchestrator | TASK [ceph-osd : Add ceph-osd systemd service overrides] *********************** 2025-09-27 00:53:56.636757 | orchestrator | Saturday 27 September 2025 00:50:49 +0000 (0:00:00.311) 0:07:17.573 **** 2025-09-27 00:53:56.636762 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:53:56.636767 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:53:56.636771 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:53:56.636776 | orchestrator | 2025-09-27 00:53:56.636781 | orchestrator | TASK [ceph-osd : Ensure /var/lib/ceph/osd/- is present] ********* 2025-09-27 00:53:56.636786 | orchestrator | Saturday 27 September 2025 00:50:50 +0000 (0:00:00.319) 0:07:17.892 **** 2025-09-27 00:53:56.636791 | orchestrator | ok: [testbed-node-3] => (item=0) 2025-09-27 00:53:56.636795 | orchestrator | ok: [testbed-node-3] => (item=3) 2025-09-27 00:53:56.636800 | orchestrator | ok: [testbed-node-4] => (item=1) 2025-09-27 00:53:56.636805 | orchestrator | ok: [testbed-node-5] => (item=4) 2025-09-27 00:53:56.636810 | orchestrator | ok: [testbed-node-4] => (item=5) 2025-09-27 00:53:56.636815 | orchestrator | ok: [testbed-node-5] => (item=2) 2025-09-27 00:53:56.636819 | orchestrator | 2025-09-27 00:53:56.636824 | orchestrator | TASK [ceph-osd : Write run file in /var/lib/ceph/osd/xxxx/run] ***************** 2025-09-27 00:53:56.636832 | orchestrator | Saturday 27 September 2025 00:50:51 +0000 (0:00:01.403) 0:07:19.296 **** 2025-09-27 00:53:56.636837 | orchestrator | changed: [testbed-node-3] => (item=0) 2025-09-27 00:53:56.636842 | orchestrator | changed: [testbed-node-5] => (item=4) 2025-09-27 00:53:56.636846 | orchestrator | changed: [testbed-node-4] => (item=1) 2025-09-27 00:53:56.636851 | orchestrator | changed: [testbed-node-3] => (item=3) 2025-09-27 00:53:56.636856 | orchestrator | changed: [testbed-node-5] => (item=2) 2025-09-27 00:53:56.636861 | orchestrator | changed: [testbed-node-4] => (item=5) 2025-09-27 00:53:56.636866 | orchestrator | 2025-09-27 00:53:56.636870 | orchestrator | TASK [ceph-osd : Systemd start osd] ******************************************** 2025-09-27 00:53:56.636878 | orchestrator | Saturday 27 September 2025 00:50:53 +0000 (0:00:02.174) 0:07:21.470 **** 2025-09-27 00:53:56.636883 | orchestrator | changed: [testbed-node-3] => (item=0) 2025-09-27 00:53:56.636888 | orchestrator | changed: [testbed-node-4] => (item=1) 2025-09-27 00:53:56.636893 | orchestrator | changed: [testbed-node-5] => (item=4) 2025-09-27 00:53:56.636898 | orchestrator | changed: [testbed-node-3] => (item=3) 2025-09-27 00:53:56.636903 | orchestrator | changed: [testbed-node-5] => (item=2) 2025-09-27 00:53:56.636907 | orchestrator | changed: [testbed-node-4] => 
(item=5) 2025-09-27 00:53:56.636912 | orchestrator | 2025-09-27 00:53:56.636917 | orchestrator | TASK [ceph-osd : Unset noup flag] ********************************************** 2025-09-27 00:53:56.636922 | orchestrator | Saturday 27 September 2025 00:50:57 +0000 (0:00:03.464) 0:07:24.935 **** 2025-09-27 00:53:56.636927 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:53:56.636931 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:53:56.636936 | orchestrator | changed: [testbed-node-5 -> testbed-node-0(192.168.16.10)] 2025-09-27 00:53:56.636941 | orchestrator | 2025-09-27 00:53:56.636946 | orchestrator | TASK [ceph-osd : Wait for all osd to be up] ************************************ 2025-09-27 00:53:56.636951 | orchestrator | Saturday 27 September 2025 00:50:59 +0000 (0:00:02.385) 0:07:27.320 **** 2025-09-27 00:53:56.636955 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:53:56.636960 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:53:56.636965 | orchestrator | FAILED - RETRYING: [testbed-node-5 -> testbed-node-0]: Wait for all osd to be up (60 retries left). 2025-09-27 00:53:56.636970 | orchestrator | ok: [testbed-node-5 -> testbed-node-0(192.168.16.10)] 2025-09-27 00:53:56.636975 | orchestrator | 2025-09-27 00:53:56.636979 | orchestrator | TASK [ceph-osd : Include crush_rules.yml] ************************************** 2025-09-27 00:53:56.636984 | orchestrator | Saturday 27 September 2025 00:51:12 +0000 (0:00:12.834) 0:07:40.155 **** 2025-09-27 00:53:56.636989 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:53:56.636994 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:53:56.636999 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:53:56.637003 | orchestrator | 2025-09-27 00:53:56.637008 | orchestrator | RUNNING HANDLER [ceph-handler : Make tempdir for scripts] ********************** 2025-09-27 00:53:56.637013 | orchestrator | Saturday 27 September 2025 00:51:13 +0000 (0:00:00.829) 0:07:40.984 **** 2025-09-27 00:53:56.637018 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:53:56.637023 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:53:56.637027 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:53:56.637032 | orchestrator | 2025-09-27 00:53:56.637037 | orchestrator | RUNNING HANDLER [ceph-handler : Osds handler] ********************************** 2025-09-27 00:53:56.637042 | orchestrator | Saturday 27 September 2025 00:51:13 +0000 (0:00:00.553) 0:07:41.538 **** 2025-09-27 00:53:56.637047 | orchestrator | included: /ansible/roles/ceph-handler/tasks/handler_osds.yml for testbed-node-3, testbed-node-4, testbed-node-5 2025-09-27 00:53:56.637051 | orchestrator | 2025-09-27 00:53:56.637056 | orchestrator | RUNNING HANDLER [ceph-handler : Set_fact trigger_restart] ********************** 2025-09-27 00:53:56.637064 | orchestrator | Saturday 27 September 2025 00:51:14 +0000 (0:00:00.527) 0:07:42.065 **** 2025-09-27 00:53:56.637069 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-3)  2025-09-27 00:53:56.637077 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-4)  2025-09-27 00:53:56.637082 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-5)  2025-09-27 00:53:56.637111 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:53:56.637116 | orchestrator | 2025-09-27 00:53:56.637121 | orchestrator | RUNNING HANDLER [ceph-handler : Set _osd_handler_called before restart] ******** 2025-09-27 00:53:56.637126 | orchestrator | Saturday 27 September 2025 00:51:14 +0000 
(0:00:00.362) 0:07:42.428 **** 2025-09-27 00:53:56.637131 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:53:56.637136 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:53:56.637141 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:53:56.637145 | orchestrator | 2025-09-27 00:53:56.637150 | orchestrator | RUNNING HANDLER [ceph-handler : Unset noup flag] ******************************* 2025-09-27 00:53:56.637155 | orchestrator | Saturday 27 September 2025 00:51:15 +0000 (0:00:00.515) 0:07:42.944 **** 2025-09-27 00:53:56.637160 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:53:56.637164 | orchestrator | 2025-09-27 00:53:56.637169 | orchestrator | RUNNING HANDLER [ceph-handler : Copy osd restart script] *********************** 2025-09-27 00:53:56.637174 | orchestrator | Saturday 27 September 2025 00:51:15 +0000 (0:00:00.263) 0:07:43.207 **** 2025-09-27 00:53:56.637179 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:53:56.637184 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:53:56.637188 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:53:56.637193 | orchestrator | 2025-09-27 00:53:56.637198 | orchestrator | RUNNING HANDLER [ceph-handler : Get pool list] ********************************* 2025-09-27 00:53:56.637203 | orchestrator | Saturday 27 September 2025 00:51:15 +0000 (0:00:00.306) 0:07:43.514 **** 2025-09-27 00:53:56.637208 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:53:56.637212 | orchestrator | 2025-09-27 00:53:56.637217 | orchestrator | RUNNING HANDLER [ceph-handler : Get balancer module status] ******************** 2025-09-27 00:53:56.637222 | orchestrator | Saturday 27 September 2025 00:51:16 +0000 (0:00:00.233) 0:07:43.748 **** 2025-09-27 00:53:56.637227 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:53:56.637232 | orchestrator | 2025-09-27 00:53:56.637236 | orchestrator | RUNNING HANDLER [ceph-handler : Set_fact pools_pgautoscaler_mode] ************** 2025-09-27 00:53:56.637241 | orchestrator | Saturday 27 September 2025 00:51:16 +0000 (0:00:00.225) 0:07:43.974 **** 2025-09-27 00:53:56.637246 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:53:56.637251 | orchestrator | 2025-09-27 00:53:56.637256 | orchestrator | RUNNING HANDLER [ceph-handler : Disable balancer] ****************************** 2025-09-27 00:53:56.637260 | orchestrator | Saturday 27 September 2025 00:51:16 +0000 (0:00:00.139) 0:07:44.113 **** 2025-09-27 00:53:56.637265 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:53:56.637270 | orchestrator | 2025-09-27 00:53:56.637275 | orchestrator | RUNNING HANDLER [ceph-handler : Disable pg autoscale on pools] ***************** 2025-09-27 00:53:56.637280 | orchestrator | Saturday 27 September 2025 00:51:16 +0000 (0:00:00.216) 0:07:44.330 **** 2025-09-27 00:53:56.637284 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:53:56.637289 | orchestrator | 2025-09-27 00:53:56.637393 | orchestrator | RUNNING HANDLER [ceph-handler : Restart ceph osds daemon(s)] ******************* 2025-09-27 00:53:56.637402 | orchestrator | Saturday 27 September 2025 00:51:16 +0000 (0:00:00.215) 0:07:44.545 **** 2025-09-27 00:53:56.637407 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-3)  2025-09-27 00:53:56.637412 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-5)  2025-09-27 00:53:56.637417 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-4)  2025-09-27 00:53:56.637422 | orchestrator | skipping: [testbed-node-3] 2025-09-27 
00:53:56.637427 | orchestrator | 2025-09-27 00:53:56.637432 | orchestrator | RUNNING HANDLER [ceph-handler : Set _osd_handler_called after restart] ********* 2025-09-27 00:53:56.637437 | orchestrator | Saturday 27 September 2025 00:51:17 +0000 (0:00:00.635) 0:07:45.180 **** 2025-09-27 00:53:56.637443 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:53:56.637448 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:53:56.637453 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:53:56.637462 | orchestrator | 2025-09-27 00:53:56.637467 | orchestrator | RUNNING HANDLER [ceph-handler : Re-enable pg autoscale on pools] *************** 2025-09-27 00:53:56.637473 | orchestrator | Saturday 27 September 2025 00:51:18 +0000 (0:00:00.526) 0:07:45.707 **** 2025-09-27 00:53:56.637478 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:53:56.637483 | orchestrator | 2025-09-27 00:53:56.637488 | orchestrator | RUNNING HANDLER [ceph-handler : Re-enable balancer] **************************** 2025-09-27 00:53:56.637493 | orchestrator | Saturday 27 September 2025 00:51:18 +0000 (0:00:00.220) 0:07:45.927 **** 2025-09-27 00:53:56.637498 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:53:56.637503 | orchestrator | 2025-09-27 00:53:56.637508 | orchestrator | PLAY [Apply role ceph-crash] *************************************************** 2025-09-27 00:53:56.637513 | orchestrator | 2025-09-27 00:53:56.637518 | orchestrator | TASK [ceph-handler : Include check_running_cluster.yml] ************************ 2025-09-27 00:53:56.637523 | orchestrator | Saturday 27 September 2025 00:51:18 +0000 (0:00:00.632) 0:07:46.559 **** 2025-09-27 00:53:56.637528 | orchestrator | included: /ansible/roles/ceph-handler/tasks/check_running_cluster.yml for testbed-node-0, testbed-node-1, testbed-node-2, testbed-node-3, testbed-node-4, testbed-node-5 2025-09-27 00:53:56.637534 | orchestrator | 2025-09-27 00:53:56.637539 | orchestrator | TASK [ceph-handler : Include check_running_containers.yml] ********************* 2025-09-27 00:53:56.637544 | orchestrator | Saturday 27 September 2025 00:51:20 +0000 (0:00:01.187) 0:07:47.747 **** 2025-09-27 00:53:56.637549 | orchestrator | included: /ansible/roles/ceph-handler/tasks/check_running_containers.yml for testbed-node-0, testbed-node-1, testbed-node-2, testbed-node-3, testbed-node-4, testbed-node-5 2025-09-27 00:53:56.637554 | orchestrator | 2025-09-27 00:53:56.637559 | orchestrator | TASK [ceph-handler : Check for a mon container] ******************************** 2025-09-27 00:53:56.637564 | orchestrator | Saturday 27 September 2025 00:51:21 +0000 (0:00:01.174) 0:07:48.921 **** 2025-09-27 00:53:56.637569 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:53:56.637574 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:53:56.637583 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:53:56.637588 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:53:56.637593 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:53:56.637598 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:53:56.637603 | orchestrator | 2025-09-27 00:53:56.637608 | orchestrator | TASK [ceph-handler : Check for an osd container] ******************************* 2025-09-27 00:53:56.637613 | orchestrator | Saturday 27 September 2025 00:51:22 +0000 (0:00:00.818) 0:07:49.740 **** 2025-09-27 00:53:56.637618 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:53:56.637623 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:53:56.637628 | orchestrator | skipping: 
[testbed-node-2] 2025-09-27 00:53:56.637633 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:53:56.637638 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:53:56.637643 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:53:56.637648 | orchestrator | 2025-09-27 00:53:56.637653 | orchestrator | TASK [ceph-handler : Check for a mds container] ******************************** 2025-09-27 00:53:56.637657 | orchestrator | Saturday 27 September 2025 00:51:23 +0000 (0:00:00.989) 0:07:50.730 **** 2025-09-27 00:53:56.637662 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:53:56.637667 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:53:56.637671 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:53:56.637676 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:53:56.637681 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:53:56.637686 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:53:56.637690 | orchestrator | 2025-09-27 00:53:56.637695 | orchestrator | TASK [ceph-handler : Check for a rgw container] ******************************** 2025-09-27 00:53:56.637700 | orchestrator | Saturday 27 September 2025 00:51:24 +0000 (0:00:01.246) 0:07:51.976 **** 2025-09-27 00:53:56.637705 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:53:56.637709 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:53:56.637714 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:53:56.637719 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:53:56.637729 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:53:56.637734 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:53:56.637739 | orchestrator | 2025-09-27 00:53:56.637743 | orchestrator | TASK [ceph-handler : Check for a mgr container] ******************************** 2025-09-27 00:53:56.637748 | orchestrator | Saturday 27 September 2025 00:51:25 +0000 (0:00:00.975) 0:07:52.952 **** 2025-09-27 00:53:56.637753 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:53:56.637758 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:53:56.637762 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:53:56.637767 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:53:56.637772 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:53:56.637776 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:53:56.637781 | orchestrator | 2025-09-27 00:53:56.637786 | orchestrator | TASK [ceph-handler : Check for a rbd mirror container] ************************* 2025-09-27 00:53:56.637791 | orchestrator | Saturday 27 September 2025 00:51:26 +0000 (0:00:00.836) 0:07:53.788 **** 2025-09-27 00:53:56.637795 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:53:56.637800 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:53:56.637805 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:53:56.637809 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:53:56.637814 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:53:56.637819 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:53:56.637824 | orchestrator | 2025-09-27 00:53:56.637842 | orchestrator | TASK [ceph-handler : Check for a nfs container] ******************************** 2025-09-27 00:53:56.637848 | orchestrator | Saturday 27 September 2025 00:51:26 +0000 (0:00:00.578) 0:07:54.367 **** 2025-09-27 00:53:56.637853 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:53:56.637857 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:53:56.637862 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:53:56.637867 | orchestrator | 
skipping: [testbed-node-3] 2025-09-27 00:53:56.637871 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:53:56.637876 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:53:56.637881 | orchestrator | 2025-09-27 00:53:56.637885 | orchestrator | TASK [ceph-handler : Check for a ceph-crash container] ************************* 2025-09-27 00:53:56.637890 | orchestrator | Saturday 27 September 2025 00:51:27 +0000 (0:00:00.727) 0:07:55.095 **** 2025-09-27 00:53:56.637895 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:53:56.637900 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:53:56.637904 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:53:56.637909 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:53:56.637914 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:53:56.637918 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:53:56.637923 | orchestrator | 2025-09-27 00:53:56.637928 | orchestrator | TASK [ceph-handler : Check for a ceph-exporter container] ********************** 2025-09-27 00:53:56.637932 | orchestrator | Saturday 27 September 2025 00:51:28 +0000 (0:00:00.987) 0:07:56.082 **** 2025-09-27 00:53:56.637937 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:53:56.637942 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:53:56.637946 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:53:56.637951 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:53:56.637956 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:53:56.637960 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:53:56.637965 | orchestrator | 2025-09-27 00:53:56.637969 | orchestrator | TASK [ceph-handler : Include check_socket_non_container.yml] ******************* 2025-09-27 00:53:56.637974 | orchestrator | Saturday 27 September 2025 00:51:29 +0000 (0:00:01.211) 0:07:57.294 **** 2025-09-27 00:53:56.637979 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:53:56.637984 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:53:56.637988 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:53:56.637993 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:53:56.637997 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:53:56.638002 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:53:56.638008 | orchestrator | 2025-09-27 00:53:56.638027 | orchestrator | TASK [ceph-handler : Set_fact handler_mon_status] ****************************** 2025-09-27 00:53:56.638039 | orchestrator | Saturday 27 September 2025 00:51:30 +0000 (0:00:00.579) 0:07:57.874 **** 2025-09-27 00:53:56.638045 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:53:56.638050 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:53:56.638055 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:53:56.638060 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:53:56.638065 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:53:56.638070 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:53:56.638076 | orchestrator | 2025-09-27 00:53:56.638081 | orchestrator | TASK [ceph-handler : Set_fact handler_osd_status] ****************************** 2025-09-27 00:53:56.638096 | orchestrator | Saturday 27 September 2025 00:51:30 +0000 (0:00:00.593) 0:07:58.468 **** 2025-09-27 00:53:56.638101 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:53:56.638109 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:53:56.638115 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:53:56.638120 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:53:56.638125 | orchestrator | ok: 
[testbed-node-4] 2025-09-27 00:53:56.638130 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:53:56.638136 | orchestrator | 2025-09-27 00:53:56.638141 | orchestrator | TASK [ceph-handler : Set_fact handler_mds_status] ****************************** 2025-09-27 00:53:56.638146 | orchestrator | Saturday 27 September 2025 00:51:31 +0000 (0:00:00.838) 0:07:59.306 **** 2025-09-27 00:53:56.638152 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:53:56.638157 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:53:56.638162 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:53:56.638167 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:53:56.638173 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:53:56.638178 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:53:56.638184 | orchestrator | 2025-09-27 00:53:56.638189 | orchestrator | TASK [ceph-handler : Set_fact handler_rgw_status] ****************************** 2025-09-27 00:53:56.638194 | orchestrator | Saturday 27 September 2025 00:51:32 +0000 (0:00:00.576) 0:07:59.883 **** 2025-09-27 00:53:56.638199 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:53:56.638204 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:53:56.638208 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:53:56.638213 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:53:56.638217 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:53:56.638222 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:53:56.638226 | orchestrator | 2025-09-27 00:53:56.638231 | orchestrator | TASK [ceph-handler : Set_fact handler_nfs_status] ****************************** 2025-09-27 00:53:56.638235 | orchestrator | Saturday 27 September 2025 00:51:33 +0000 (0:00:00.840) 0:08:00.724 **** 2025-09-27 00:53:56.638240 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:53:56.638244 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:53:56.638249 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:53:56.638253 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:53:56.638257 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:53:56.638262 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:53:56.638266 | orchestrator | 2025-09-27 00:53:56.638271 | orchestrator | TASK [ceph-handler : Set_fact handler_rbd_status] ****************************** 2025-09-27 00:53:56.638276 | orchestrator | Saturday 27 September 2025 00:51:33 +0000 (0:00:00.596) 0:08:01.320 **** 2025-09-27 00:53:56.638280 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:53:56.638284 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:53:56.638289 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:53:56.638293 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:53:56.638298 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:53:56.638302 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:53:56.638307 | orchestrator | 2025-09-27 00:53:56.638311 | orchestrator | TASK [ceph-handler : Set_fact handler_mgr_status] ****************************** 2025-09-27 00:53:56.638316 | orchestrator | Saturday 27 September 2025 00:51:34 +0000 (0:00:00.700) 0:08:02.020 **** 2025-09-27 00:53:56.638320 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:53:56.638325 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:53:56.638329 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:53:56.638338 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:53:56.638343 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:53:56.638348 | 
orchestrator | skipping: [testbed-node-5] 2025-09-27 00:53:56.638352 | orchestrator | 2025-09-27 00:53:56.638371 | orchestrator | TASK [ceph-handler : Set_fact handler_crash_status] **************************** 2025-09-27 00:53:56.638377 | orchestrator | Saturday 27 September 2025 00:51:34 +0000 (0:00:00.518) 0:08:02.539 **** 2025-09-27 00:53:56.638381 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:53:56.638386 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:53:56.638390 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:53:56.638395 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:53:56.638399 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:53:56.638404 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:53:56.638408 | orchestrator | 2025-09-27 00:53:56.638413 | orchestrator | TASK [ceph-handler : Set_fact handler_exporter_status] ************************* 2025-09-27 00:53:56.638417 | orchestrator | Saturday 27 September 2025 00:51:35 +0000 (0:00:00.694) 0:08:03.233 **** 2025-09-27 00:53:56.638422 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:53:56.638426 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:53:56.638431 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:53:56.638435 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:53:56.638439 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:53:56.638444 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:53:56.638448 | orchestrator | 2025-09-27 00:53:56.638453 | orchestrator | TASK [ceph-crash : Create client.crash keyring] ******************************** 2025-09-27 00:53:56.638457 | orchestrator | Saturday 27 September 2025 00:51:36 +0000 (0:00:01.062) 0:08:04.295 **** 2025-09-27 00:53:56.638462 | orchestrator | changed: [testbed-node-0] 2025-09-27 00:53:56.638466 | orchestrator | 2025-09-27 00:53:56.638471 | orchestrator | TASK [ceph-crash : Get keys from monitors] ************************************* 2025-09-27 00:53:56.638476 | orchestrator | Saturday 27 September 2025 00:51:40 +0000 (0:00:04.125) 0:08:08.421 **** 2025-09-27 00:53:56.638480 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:53:56.638484 | orchestrator | 2025-09-27 00:53:56.638489 | orchestrator | TASK [ceph-crash : Copy ceph key(s) if needed] ********************************* 2025-09-27 00:53:56.638494 | orchestrator | Saturday 27 September 2025 00:51:42 +0000 (0:00:02.146) 0:08:10.568 **** 2025-09-27 00:53:56.638498 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:53:56.638503 | orchestrator | changed: [testbed-node-1] 2025-09-27 00:53:56.638507 | orchestrator | changed: [testbed-node-2] 2025-09-27 00:53:56.638512 | orchestrator | changed: [testbed-node-3] 2025-09-27 00:53:56.638516 | orchestrator | changed: [testbed-node-4] 2025-09-27 00:53:56.638521 | orchestrator | changed: [testbed-node-5] 2025-09-27 00:53:56.638525 | orchestrator | 2025-09-27 00:53:56.638530 | orchestrator | TASK [ceph-crash : Create /var/lib/ceph/crash/posted] ************************** 2025-09-27 00:53:56.638534 | orchestrator | Saturday 27 September 2025 00:51:44 +0000 (0:00:01.623) 0:08:12.192 **** 2025-09-27 00:53:56.638539 | orchestrator | changed: [testbed-node-0] 2025-09-27 00:53:56.638543 | orchestrator | changed: [testbed-node-1] 2025-09-27 00:53:56.638548 | orchestrator | changed: [testbed-node-2] 2025-09-27 00:53:56.638552 | orchestrator | changed: [testbed-node-3] 2025-09-27 00:53:56.638557 | orchestrator | changed: [testbed-node-4] 2025-09-27 00:53:56.638561 | orchestrator | changed: [testbed-node-5] 2025-09-27 00:53:56.638566 | orchestrator | 
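Annotation: the ceph-crash sequence above (create the client.crash keyring on the first monitor, fetch it, copy it to every node, create /var/lib/ceph/crash/posted) corresponds loosely to the manual steps sketched below; the capability string follows the upstream crash profile and is an assumption about what the role's module call expands to, and directory ownership may differ in containerized setups:

    # On a monitor: create the crash client keyring (caps per the standard crash profile; assumed, not shown in the log)
    ceph auth get-or-create client.crash \
        mon 'allow profile crash' mgr 'allow profile crash' \
        -o /etc/ceph/ceph.client.crash.keyring

    # On every node: ensure the directory the ceph-crash agent posts collected dumps to exists
    install -d -m 0755 /var/lib/ceph/crash/posted

The distributed keyring is what lets the containerized ceph-crash daemon, started by the systemd unit generated in the next tasks, report collected crash dumps back to the cluster.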
2025-09-27 00:53:56.638570 | orchestrator | TASK [ceph-crash : Include_tasks systemd.yml] ********************************** 2025-09-27 00:53:56.638575 | orchestrator | Saturday 27 September 2025 00:51:45 +0000 (0:00:00.995) 0:08:13.188 **** 2025-09-27 00:53:56.638582 | orchestrator | included: /ansible/roles/ceph-crash/tasks/systemd.yml for testbed-node-0, testbed-node-1, testbed-node-2, testbed-node-3, testbed-node-4, testbed-node-5 2025-09-27 00:53:56.638587 | orchestrator | 2025-09-27 00:53:56.638592 | orchestrator | TASK [ceph-crash : Generate systemd unit file for ceph-crash container] ******** 2025-09-27 00:53:56.638596 | orchestrator | Saturday 27 September 2025 00:51:46 +0000 (0:00:01.179) 0:08:14.367 **** 2025-09-27 00:53:56.638601 | orchestrator | changed: [testbed-node-0] 2025-09-27 00:53:56.638609 | orchestrator | changed: [testbed-node-1] 2025-09-27 00:53:56.638614 | orchestrator | changed: [testbed-node-4] 2025-09-27 00:53:56.638618 | orchestrator | changed: [testbed-node-3] 2025-09-27 00:53:56.638623 | orchestrator | changed: [testbed-node-2] 2025-09-27 00:53:56.638627 | orchestrator | changed: [testbed-node-5] 2025-09-27 00:53:56.638632 | orchestrator | 2025-09-27 00:53:56.638636 | orchestrator | TASK [ceph-crash : Start the ceph-crash service] ******************************* 2025-09-27 00:53:56.638641 | orchestrator | Saturday 27 September 2025 00:51:48 +0000 (0:00:01.575) 0:08:15.943 **** 2025-09-27 00:53:56.638645 | orchestrator | changed: [testbed-node-0] 2025-09-27 00:53:56.638650 | orchestrator | changed: [testbed-node-1] 2025-09-27 00:53:56.638654 | orchestrator | changed: [testbed-node-3] 2025-09-27 00:53:56.638659 | orchestrator | changed: [testbed-node-2] 2025-09-27 00:53:56.638663 | orchestrator | changed: [testbed-node-4] 2025-09-27 00:53:56.638668 | orchestrator | changed: [testbed-node-5] 2025-09-27 00:53:56.638672 | orchestrator | 2025-09-27 00:53:56.638677 | orchestrator | RUNNING HANDLER [ceph-handler : Ceph crash handler] **************************** 2025-09-27 00:53:56.638681 | orchestrator | Saturday 27 September 2025 00:51:51 +0000 (0:00:03.421) 0:08:19.365 **** 2025-09-27 00:53:56.638686 | orchestrator | included: /ansible/roles/ceph-handler/tasks/handler_crash.yml for testbed-node-0, testbed-node-1, testbed-node-2, testbed-node-3, testbed-node-4, testbed-node-5 2025-09-27 00:53:56.638691 | orchestrator | 2025-09-27 00:53:56.638695 | orchestrator | RUNNING HANDLER [ceph-handler : Set _crash_handler_called before restart] ****** 2025-09-27 00:53:56.638700 | orchestrator | Saturday 27 September 2025 00:51:52 +0000 (0:00:01.233) 0:08:20.599 **** 2025-09-27 00:53:56.638704 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:53:56.638709 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:53:56.638713 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:53:56.638718 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:53:56.638722 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:53:56.638727 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:53:56.638731 | orchestrator | 2025-09-27 00:53:56.638736 | orchestrator | RUNNING HANDLER [ceph-handler : Restart the ceph-crash service] **************** 2025-09-27 00:53:56.638740 | orchestrator | Saturday 27 September 2025 00:51:53 +0000 (0:00:00.608) 0:08:21.207 **** 2025-09-27 00:53:56.638745 | orchestrator | changed: [testbed-node-0] 2025-09-27 00:53:56.638749 | orchestrator | changed: [testbed-node-3] 2025-09-27 00:53:56.638754 | orchestrator | changed: [testbed-node-1] 2025-09-27 00:53:56.638758 | 
orchestrator | changed: [testbed-node-4] 2025-09-27 00:53:56.638763 | orchestrator | changed: [testbed-node-2] 2025-09-27 00:53:56.638767 | orchestrator | changed: [testbed-node-5] 2025-09-27 00:53:56.638772 | orchestrator | 2025-09-27 00:53:56.638776 | orchestrator | RUNNING HANDLER [ceph-handler : Set _crash_handler_called after restart] ******* 2025-09-27 00:53:56.638794 | orchestrator | Saturday 27 September 2025 00:51:55 +0000 (0:00:02.424) 0:08:23.632 **** 2025-09-27 00:53:56.638799 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:53:56.638804 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:53:56.638808 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:53:56.638813 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:53:56.638817 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:53:56.638821 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:53:56.638826 | orchestrator | 2025-09-27 00:53:56.638831 | orchestrator | PLAY [Apply role ceph-mds] ***************************************************** 2025-09-27 00:53:56.638835 | orchestrator | 2025-09-27 00:53:56.638839 | orchestrator | TASK [ceph-handler : Include check_running_cluster.yml] ************************ 2025-09-27 00:53:56.638844 | orchestrator | Saturday 27 September 2025 00:51:57 +0000 (0:00:01.013) 0:08:24.645 **** 2025-09-27 00:53:56.638849 | orchestrator | included: /ansible/roles/ceph-handler/tasks/check_running_cluster.yml for testbed-node-3, testbed-node-4, testbed-node-5 2025-09-27 00:53:56.638853 | orchestrator | 2025-09-27 00:53:56.638858 | orchestrator | TASK [ceph-handler : Include check_running_containers.yml] ********************* 2025-09-27 00:53:56.638862 | orchestrator | Saturday 27 September 2025 00:51:57 +0000 (0:00:00.505) 0:08:25.151 **** 2025-09-27 00:53:56.638870 | orchestrator | included: /ansible/roles/ceph-handler/tasks/check_running_containers.yml for testbed-node-3, testbed-node-4, testbed-node-5 2025-09-27 00:53:56.638874 | orchestrator | 2025-09-27 00:53:56.638879 | orchestrator | TASK [ceph-handler : Check for a mon container] ******************************** 2025-09-27 00:53:56.638883 | orchestrator | Saturday 27 September 2025 00:51:58 +0000 (0:00:00.661) 0:08:25.813 **** 2025-09-27 00:53:56.638888 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:53:56.638892 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:53:56.638897 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:53:56.638901 | orchestrator | 2025-09-27 00:53:56.638906 | orchestrator | TASK [ceph-handler : Check for an osd container] ******************************* 2025-09-27 00:53:56.638910 | orchestrator | Saturday 27 September 2025 00:51:58 +0000 (0:00:00.313) 0:08:26.126 **** 2025-09-27 00:53:56.638915 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:53:56.638920 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:53:56.638924 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:53:56.638929 | orchestrator | 2025-09-27 00:53:56.638933 | orchestrator | TASK [ceph-handler : Check for a mds container] ******************************** 2025-09-27 00:53:56.638938 | orchestrator | Saturday 27 September 2025 00:51:59 +0000 (0:00:00.712) 0:08:26.839 **** 2025-09-27 00:53:56.638942 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:53:56.638947 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:53:56.638951 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:53:56.638955 | orchestrator | 2025-09-27 00:53:56.638960 | orchestrator | TASK [ceph-handler : Check for a rgw container] 
******************************** 2025-09-27 00:53:56.638965 | orchestrator | Saturday 27 September 2025 00:51:59 +0000 (0:00:00.711) 0:08:27.550 **** 2025-09-27 00:53:56.638969 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:53:56.638974 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:53:56.638978 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:53:56.638983 | orchestrator | 2025-09-27 00:53:56.638990 | orchestrator | TASK [ceph-handler : Check for a mgr container] ******************************** 2025-09-27 00:53:56.638995 | orchestrator | Saturday 27 September 2025 00:52:00 +0000 (0:00:01.011) 0:08:28.562 **** 2025-09-27 00:53:56.638999 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:53:56.639004 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:53:56.639008 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:53:56.639013 | orchestrator | 2025-09-27 00:53:56.639017 | orchestrator | TASK [ceph-handler : Check for a rbd mirror container] ************************* 2025-09-27 00:53:56.639022 | orchestrator | Saturday 27 September 2025 00:52:01 +0000 (0:00:00.313) 0:08:28.875 **** 2025-09-27 00:53:56.639026 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:53:56.639031 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:53:56.639035 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:53:56.639039 | orchestrator | 2025-09-27 00:53:56.639044 | orchestrator | TASK [ceph-handler : Check for a nfs container] ******************************** 2025-09-27 00:53:56.639048 | orchestrator | Saturday 27 September 2025 00:52:01 +0000 (0:00:00.301) 0:08:29.176 **** 2025-09-27 00:53:56.639053 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:53:56.639057 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:53:56.639062 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:53:56.639066 | orchestrator | 2025-09-27 00:53:56.639071 | orchestrator | TASK [ceph-handler : Check for a ceph-crash container] ************************* 2025-09-27 00:53:56.639075 | orchestrator | Saturday 27 September 2025 00:52:01 +0000 (0:00:00.280) 0:08:29.457 **** 2025-09-27 00:53:56.639080 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:53:56.639084 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:53:56.639113 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:53:56.639118 | orchestrator | 2025-09-27 00:53:56.639123 | orchestrator | TASK [ceph-handler : Check for a ceph-exporter container] ********************** 2025-09-27 00:53:56.639127 | orchestrator | Saturday 27 September 2025 00:52:02 +0000 (0:00:01.012) 0:08:30.469 **** 2025-09-27 00:53:56.639132 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:53:56.639136 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:53:56.639144 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:53:56.639149 | orchestrator | 2025-09-27 00:53:56.639153 | orchestrator | TASK [ceph-handler : Include check_socket_non_container.yml] ******************* 2025-09-27 00:53:56.639158 | orchestrator | Saturday 27 September 2025 00:52:03 +0000 (0:00:00.706) 0:08:31.175 **** 2025-09-27 00:53:56.639163 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:53:56.639167 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:53:56.639172 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:53:56.639176 | orchestrator | 2025-09-27 00:53:56.639181 | orchestrator | TASK [ceph-handler : Set_fact handler_mon_status] ****************************** 2025-09-27 00:53:56.639185 | orchestrator | Saturday 27 September 2025 00:52:03 
+0000 (0:00:00.295) 0:08:31.471 **** 2025-09-27 00:53:56.639190 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:53:56.639194 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:53:56.639199 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:53:56.639203 | orchestrator | 2025-09-27 00:53:56.639207 | orchestrator | TASK [ceph-handler : Set_fact handler_osd_status] ****************************** 2025-09-27 00:53:56.639212 | orchestrator | Saturday 27 September 2025 00:52:04 +0000 (0:00:00.282) 0:08:31.754 **** 2025-09-27 00:53:56.639231 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:53:56.639236 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:53:56.639241 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:53:56.639245 | orchestrator | 2025-09-27 00:53:56.639250 | orchestrator | TASK [ceph-handler : Set_fact handler_mds_status] ****************************** 2025-09-27 00:53:56.639254 | orchestrator | Saturday 27 September 2025 00:52:04 +0000 (0:00:00.556) 0:08:32.310 **** 2025-09-27 00:53:56.639259 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:53:56.639263 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:53:56.639268 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:53:56.639272 | orchestrator | 2025-09-27 00:53:56.639277 | orchestrator | TASK [ceph-handler : Set_fact handler_rgw_status] ****************************** 2025-09-27 00:53:56.639281 | orchestrator | Saturday 27 September 2025 00:52:04 +0000 (0:00:00.325) 0:08:32.636 **** 2025-09-27 00:53:56.639286 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:53:56.639290 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:53:56.639295 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:53:56.639299 | orchestrator | 2025-09-27 00:53:56.639304 | orchestrator | TASK [ceph-handler : Set_fact handler_nfs_status] ****************************** 2025-09-27 00:53:56.639308 | orchestrator | Saturday 27 September 2025 00:52:05 +0000 (0:00:00.313) 0:08:32.949 **** 2025-09-27 00:53:56.639313 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:53:56.639317 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:53:56.639322 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:53:56.639326 | orchestrator | 2025-09-27 00:53:56.639331 | orchestrator | TASK [ceph-handler : Set_fact handler_rbd_status] ****************************** 2025-09-27 00:53:56.639335 | orchestrator | Saturday 27 September 2025 00:52:05 +0000 (0:00:00.302) 0:08:33.252 **** 2025-09-27 00:53:56.639340 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:53:56.639344 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:53:56.639349 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:53:56.639353 | orchestrator | 2025-09-27 00:53:56.639358 | orchestrator | TASK [ceph-handler : Set_fact handler_mgr_status] ****************************** 2025-09-27 00:53:56.639362 | orchestrator | Saturday 27 September 2025 00:52:06 +0000 (0:00:00.513) 0:08:33.765 **** 2025-09-27 00:53:56.639367 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:53:56.639371 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:53:56.639376 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:53:56.639380 | orchestrator | 2025-09-27 00:53:56.639385 | orchestrator | TASK [ceph-handler : Set_fact handler_crash_status] **************************** 2025-09-27 00:53:56.639389 | orchestrator | Saturday 27 September 2025 00:52:06 +0000 (0:00:00.294) 0:08:34.060 **** 2025-09-27 00:53:56.639394 | orchestrator | ok: [testbed-node-3] 2025-09-27 
00:53:56.639398 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:53:56.639403 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:53:56.639407 | orchestrator | 2025-09-27 00:53:56.639415 | orchestrator | TASK [ceph-handler : Set_fact handler_exporter_status] ************************* 2025-09-27 00:53:56.639420 | orchestrator | Saturday 27 September 2025 00:52:06 +0000 (0:00:00.328) 0:08:34.388 **** 2025-09-27 00:53:56.639424 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:53:56.639429 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:53:56.639433 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:53:56.639438 | orchestrator | 2025-09-27 00:53:56.639445 | orchestrator | TASK [ceph-mds : Include create_mds_filesystems.yml] *************************** 2025-09-27 00:53:56.639450 | orchestrator | Saturday 27 September 2025 00:52:07 +0000 (0:00:00.721) 0:08:35.110 **** 2025-09-27 00:53:56.639454 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:53:56.639459 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:53:56.639463 | orchestrator | included: /ansible/roles/ceph-mds/tasks/create_mds_filesystems.yml for testbed-node-3 2025-09-27 00:53:56.639468 | orchestrator | 2025-09-27 00:53:56.639472 | orchestrator | TASK [ceph-facts : Get current default crush rule details] ********************* 2025-09-27 00:53:56.639477 | orchestrator | Saturday 27 September 2025 00:52:07 +0000 (0:00:00.408) 0:08:35.518 **** 2025-09-27 00:53:56.639481 | orchestrator | ok: [testbed-node-3 -> testbed-node-0(192.168.16.10)] 2025-09-27 00:53:56.639486 | orchestrator | 2025-09-27 00:53:56.639491 | orchestrator | TASK [ceph-facts : Get current default crush rule name] ************************ 2025-09-27 00:53:56.639495 | orchestrator | Saturday 27 September 2025 00:52:09 +0000 (0:00:02.117) 0:08:37.635 **** 2025-09-27 00:53:56.639500 | orchestrator | skipping: [testbed-node-3] => (item={'rule_id': 0, 'rule_name': 'replicated_rule', 'type': 1, 'steps': [{'op': 'take', 'item': -1, 'item_name': 'default'}, {'op': 'chooseleaf_firstn', 'num': 0, 'type': 'host'}, {'op': 'emit'}]})  2025-09-27 00:53:56.639506 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:53:56.639511 | orchestrator | 2025-09-27 00:53:56.639515 | orchestrator | TASK [ceph-mds : Create filesystem pools] ************************************** 2025-09-27 00:53:56.639520 | orchestrator | Saturday 27 September 2025 00:52:10 +0000 (0:00:00.228) 0:08:37.864 **** 2025-09-27 00:53:56.639525 | orchestrator | changed: [testbed-node-3 -> testbed-node-0(192.168.16.10)] => (item={'application': 'cephfs', 'erasure_profile': '', 'expected_num_objects': '', 'min_size': 0, 'name': 'cephfs_data', 'pg_num': 16, 'pgp_num': 16, 'rule_name': 'replicated_rule', 'size': 3, 'type': 1}) 2025-09-27 00:53:56.639535 | orchestrator | changed: [testbed-node-3 -> testbed-node-0(192.168.16.10)] => (item={'application': 'cephfs', 'erasure_profile': '', 'expected_num_objects': '', 'min_size': 0, 'name': 'cephfs_metadata', 'pg_num': 16, 'pgp_num': 16, 'rule_name': 'replicated_rule', 'size': 3, 'type': 1}) 2025-09-27 00:53:56.639539 | orchestrator | 2025-09-27 00:53:56.639544 | orchestrator | TASK [ceph-mds : Create ceph filesystem] *************************************** 2025-09-27 00:53:56.639548 | orchestrator | Saturday 27 September 2025 00:52:19 +0000 (0:00:08.849) 0:08:46.714 **** 2025-09-27 00:53:56.639553 | orchestrator | changed: [testbed-node-3 -> testbed-node-0(192.168.16.10)] 2025-09-27 00:53:56.639557 | orchestrator | 2025-09-27 00:53:56.639562 | 
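Annotation: the two ceph-mds tasks just above create the CephFS pools and then the filesystem itself, delegated to the first monitor. A minimal CLI sketch of the same operations, using the pool parameters shown in the loop items (pg_num/pgp_num 16, replicated_rule, size 3); the filesystem name "cephfs" is an assumption, since the log does not print it:

    # Create the data and metadata pools (16 placement groups each, replicated rule)
    ceph osd pool create cephfs_data 16 16 replicated replicated_rule
    ceph osd pool create cephfs_metadata 16 16 replicated replicated_rule

    # Create the filesystem on top of them (recent Ceph releases also tag both pools
    # with the cephfs application as part of this command)
    ceph fs new cephfs cephfs_metadata cephfs_data

    # Check the result
    ceph fs status cephfs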
orchestrator | TASK [ceph-mds : Include common.yml] ******************************************* 2025-09-27 00:53:56.639566 | orchestrator | Saturday 27 September 2025 00:52:22 +0000 (0:00:03.712) 0:08:50.426 **** 2025-09-27 00:53:56.639583 | orchestrator | included: /ansible/roles/ceph-mds/tasks/common.yml for testbed-node-3, testbed-node-4, testbed-node-5 2025-09-27 00:53:56.639589 | orchestrator | 2025-09-27 00:53:56.639593 | orchestrator | TASK [ceph-mds : Create bootstrap-mds and mds directories] ********************* 2025-09-27 00:53:56.639598 | orchestrator | Saturday 27 September 2025 00:52:23 +0000 (0:00:00.735) 0:08:51.161 **** 2025-09-27 00:53:56.639602 | orchestrator | ok: [testbed-node-3] => (item=/var/lib/ceph/bootstrap-mds/) 2025-09-27 00:53:56.639607 | orchestrator | ok: [testbed-node-4] => (item=/var/lib/ceph/bootstrap-mds/) 2025-09-27 00:53:56.639611 | orchestrator | ok: [testbed-node-5] => (item=/var/lib/ceph/bootstrap-mds/) 2025-09-27 00:53:56.639616 | orchestrator | changed: [testbed-node-3] => (item=/var/lib/ceph/mds/ceph-testbed-node-3) 2025-09-27 00:53:56.639624 | orchestrator | changed: [testbed-node-4] => (item=/var/lib/ceph/mds/ceph-testbed-node-4) 2025-09-27 00:53:56.639628 | orchestrator | changed: [testbed-node-5] => (item=/var/lib/ceph/mds/ceph-testbed-node-5) 2025-09-27 00:53:56.639633 | orchestrator | 2025-09-27 00:53:56.639637 | orchestrator | TASK [ceph-mds : Get keys from monitors] *************************************** 2025-09-27 00:53:56.639642 | orchestrator | Saturday 27 September 2025 00:52:24 +0000 (0:00:01.086) 0:08:52.248 **** 2025-09-27 00:53:56.639646 | orchestrator | ok: [testbed-node-3 -> testbed-node-0(192.168.16.10)] => (item=None) 2025-09-27 00:53:56.639651 | orchestrator | skipping: [testbed-node-3] => (item=None)  2025-09-27 00:53:56.639655 | orchestrator | ok: [testbed-node-3 -> {{ groups.get(mon_group_name)[0] }}] 2025-09-27 00:53:56.639660 | orchestrator | 2025-09-27 00:53:56.639664 | orchestrator | TASK [ceph-mds : Copy ceph key(s) if needed] *********************************** 2025-09-27 00:53:56.639668 | orchestrator | Saturday 27 September 2025 00:52:26 +0000 (0:00:02.196) 0:08:54.444 **** 2025-09-27 00:53:56.639672 | orchestrator | changed: [testbed-node-3] => (item=None) 2025-09-27 00:53:56.639676 | orchestrator | skipping: [testbed-node-3] => (item=None)  2025-09-27 00:53:56.639680 | orchestrator | changed: [testbed-node-3] 2025-09-27 00:53:56.639684 | orchestrator | changed: [testbed-node-4] => (item=None) 2025-09-27 00:53:56.639688 | orchestrator | skipping: [testbed-node-4] => (item=None)  2025-09-27 00:53:56.639692 | orchestrator | changed: [testbed-node-4] 2025-09-27 00:53:56.639696 | orchestrator | changed: [testbed-node-5] => (item=None) 2025-09-27 00:53:56.639700 | orchestrator | skipping: [testbed-node-5] => (item=None)  2025-09-27 00:53:56.639704 | orchestrator | changed: [testbed-node-5] 2025-09-27 00:53:56.639708 | orchestrator | 2025-09-27 00:53:56.639713 | orchestrator | TASK [ceph-mds : Create mds keyring] ******************************************* 2025-09-27 00:53:56.639717 | orchestrator | Saturday 27 September 2025 00:52:28 +0000 (0:00:01.235) 0:08:55.680 **** 2025-09-27 00:53:56.639721 | orchestrator | changed: [testbed-node-3] 2025-09-27 00:53:56.639725 | orchestrator | changed: [testbed-node-5] 2025-09-27 00:53:56.639729 | orchestrator | changed: [testbed-node-4] 2025-09-27 00:53:56.639733 | orchestrator | 2025-09-27 00:53:56.639740 | orchestrator | TASK [ceph-mds : Non_containerized.yml] 
**************************************** 2025-09-27 00:53:56.639744 | orchestrator | Saturday 27 September 2025 00:52:30 +0000 (0:00:02.949) 0:08:58.629 **** 2025-09-27 00:53:56.639748 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:53:56.639752 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:53:56.639756 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:53:56.639760 | orchestrator | 2025-09-27 00:53:56.639764 | orchestrator | TASK [ceph-mds : Containerized.yml] ******************************************** 2025-09-27 00:53:56.639768 | orchestrator | Saturday 27 September 2025 00:52:31 +0000 (0:00:00.302) 0:08:58.931 **** 2025-09-27 00:53:56.639773 | orchestrator | included: /ansible/roles/ceph-mds/tasks/containerized.yml for testbed-node-3, testbed-node-4, testbed-node-5 2025-09-27 00:53:56.639777 | orchestrator | 2025-09-27 00:53:56.639781 | orchestrator | TASK [ceph-mds : Include_tasks systemd.yml] ************************************ 2025-09-27 00:53:56.639785 | orchestrator | Saturday 27 September 2025 00:52:31 +0000 (0:00:00.503) 0:08:59.435 **** 2025-09-27 00:53:56.639789 | orchestrator | included: /ansible/roles/ceph-mds/tasks/systemd.yml for testbed-node-3, testbed-node-4, testbed-node-5 2025-09-27 00:53:56.639793 | orchestrator | 2025-09-27 00:53:56.639797 | orchestrator | TASK [ceph-mds : Generate systemd unit file] *********************************** 2025-09-27 00:53:56.639801 | orchestrator | Saturday 27 September 2025 00:52:32 +0000 (0:00:00.748) 0:09:00.183 **** 2025-09-27 00:53:56.639805 | orchestrator | changed: [testbed-node-3] 2025-09-27 00:53:56.639809 | orchestrator | changed: [testbed-node-4] 2025-09-27 00:53:56.639813 | orchestrator | changed: [testbed-node-5] 2025-09-27 00:53:56.639817 | orchestrator | 2025-09-27 00:53:56.639821 | orchestrator | TASK [ceph-mds : Generate systemd ceph-mds target file] ************************ 2025-09-27 00:53:56.639826 | orchestrator | Saturday 27 September 2025 00:52:33 +0000 (0:00:01.259) 0:09:01.443 **** 2025-09-27 00:53:56.639833 | orchestrator | changed: [testbed-node-3] 2025-09-27 00:53:56.639837 | orchestrator | changed: [testbed-node-4] 2025-09-27 00:53:56.639841 | orchestrator | changed: [testbed-node-5] 2025-09-27 00:53:56.639845 | orchestrator | 2025-09-27 00:53:56.639849 | orchestrator | TASK [ceph-mds : Enable ceph-mds.target] *************************************** 2025-09-27 00:53:56.639853 | orchestrator | Saturday 27 September 2025 00:52:35 +0000 (0:00:01.228) 0:09:02.672 **** 2025-09-27 00:53:56.639857 | orchestrator | changed: [testbed-node-3] 2025-09-27 00:53:56.639861 | orchestrator | changed: [testbed-node-4] 2025-09-27 00:53:56.639865 | orchestrator | changed: [testbed-node-5] 2025-09-27 00:53:56.639869 | orchestrator | 2025-09-27 00:53:56.639873 | orchestrator | TASK [ceph-mds : Systemd start mds container] ********************************** 2025-09-27 00:53:56.639877 | orchestrator | Saturday 27 September 2025 00:52:37 +0000 (0:00:02.060) 0:09:04.732 **** 2025-09-27 00:53:56.639881 | orchestrator | changed: [testbed-node-3] 2025-09-27 00:53:56.639885 | orchestrator | changed: [testbed-node-4] 2025-09-27 00:53:56.639890 | orchestrator | changed: [testbed-node-5] 2025-09-27 00:53:56.639894 | orchestrator | 2025-09-27 00:53:56.639898 | orchestrator | TASK [ceph-mds : Wait for mds socket to exist] ********************************* 2025-09-27 00:53:56.639902 | orchestrator | Saturday 27 September 2025 00:52:39 +0000 (0:00:01.987) 0:09:06.720 **** 2025-09-27 00:53:56.639906 | orchestrator 
| ok: [testbed-node-3] 2025-09-27 00:53:56.639921 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:53:56.639926 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:53:56.639930 | orchestrator | 2025-09-27 00:53:56.639934 | orchestrator | RUNNING HANDLER [ceph-handler : Make tempdir for scripts] ********************** 2025-09-27 00:53:56.639938 | orchestrator | Saturday 27 September 2025 00:52:40 +0000 (0:00:01.519) 0:09:08.240 **** 2025-09-27 00:53:56.639943 | orchestrator | changed: [testbed-node-3] 2025-09-27 00:53:56.639947 | orchestrator | changed: [testbed-node-4] 2025-09-27 00:53:56.639951 | orchestrator | changed: [testbed-node-5] 2025-09-27 00:53:56.639955 | orchestrator | 2025-09-27 00:53:56.639959 | orchestrator | RUNNING HANDLER [ceph-handler : Mdss handler] ********************************** 2025-09-27 00:53:56.639963 | orchestrator | Saturday 27 September 2025 00:52:41 +0000 (0:00:00.701) 0:09:08.942 **** 2025-09-27 00:53:56.639967 | orchestrator | included: /ansible/roles/ceph-handler/tasks/handler_mdss.yml for testbed-node-3, testbed-node-4, testbed-node-5 2025-09-27 00:53:56.639971 | orchestrator | 2025-09-27 00:53:56.639975 | orchestrator | RUNNING HANDLER [ceph-handler : Set _mds_handler_called before restart] ******** 2025-09-27 00:53:56.639979 | orchestrator | Saturday 27 September 2025 00:52:41 +0000 (0:00:00.502) 0:09:09.445 **** 2025-09-27 00:53:56.639983 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:53:56.639987 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:53:56.639992 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:53:56.639996 | orchestrator | 2025-09-27 00:53:56.640000 | orchestrator | RUNNING HANDLER [ceph-handler : Copy mds restart script] *********************** 2025-09-27 00:53:56.640004 | orchestrator | Saturday 27 September 2025 00:52:42 +0000 (0:00:00.611) 0:09:10.056 **** 2025-09-27 00:53:56.640008 | orchestrator | changed: [testbed-node-3] 2025-09-27 00:53:56.640012 | orchestrator | changed: [testbed-node-5] 2025-09-27 00:53:56.640016 | orchestrator | changed: [testbed-node-4] 2025-09-27 00:53:56.640020 | orchestrator | 2025-09-27 00:53:56.640024 | orchestrator | RUNNING HANDLER [ceph-handler : Restart ceph mds daemon(s)] ******************** 2025-09-27 00:53:56.640028 | orchestrator | Saturday 27 September 2025 00:52:43 +0000 (0:00:01.212) 0:09:11.268 **** 2025-09-27 00:53:56.640032 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-3)  2025-09-27 00:53:56.640036 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-4)  2025-09-27 00:53:56.640041 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-5)  2025-09-27 00:53:56.640045 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:53:56.640049 | orchestrator | 2025-09-27 00:53:56.640053 | orchestrator | RUNNING HANDLER [ceph-handler : Set _mds_handler_called after restart] ********* 2025-09-27 00:53:56.640057 | orchestrator | Saturday 27 September 2025 00:52:44 +0000 (0:00:00.650) 0:09:11.919 **** 2025-09-27 00:53:56.640065 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:53:56.640069 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:53:56.640073 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:53:56.640077 | orchestrator | 2025-09-27 00:53:56.640082 | orchestrator | PLAY [Apply role ceph-rgw] ***************************************************** 2025-09-27 00:53:56.640096 | orchestrator | 2025-09-27 00:53:56.640100 | orchestrator | TASK [ceph-handler : Include check_running_cluster.yml] ************************ 2025-09-27 
00:53:56.640107 | orchestrator | Saturday 27 September 2025 00:52:44 +0000 (0:00:00.565) 0:09:12.484 **** 2025-09-27 00:53:56.640111 | orchestrator | included: /ansible/roles/ceph-handler/tasks/check_running_cluster.yml for testbed-node-3, testbed-node-4, testbed-node-5 2025-09-27 00:53:56.640115 | orchestrator | 2025-09-27 00:53:56.640119 | orchestrator | TASK [ceph-handler : Include check_running_containers.yml] ********************* 2025-09-27 00:53:56.640123 | orchestrator | Saturday 27 September 2025 00:52:45 +0000 (0:00:00.816) 0:09:13.300 **** 2025-09-27 00:53:56.640127 | orchestrator | included: /ansible/roles/ceph-handler/tasks/check_running_containers.yml for testbed-node-3, testbed-node-4, testbed-node-5 2025-09-27 00:53:56.640132 | orchestrator | 2025-09-27 00:53:56.640136 | orchestrator | TASK [ceph-handler : Check for a mon container] ******************************** 2025-09-27 00:53:56.640140 | orchestrator | Saturday 27 September 2025 00:52:46 +0000 (0:00:00.508) 0:09:13.809 **** 2025-09-27 00:53:56.640144 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:53:56.640148 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:53:56.640152 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:53:56.640156 | orchestrator | 2025-09-27 00:53:56.640160 | orchestrator | TASK [ceph-handler : Check for an osd container] ******************************* 2025-09-27 00:53:56.640164 | orchestrator | Saturday 27 September 2025 00:52:46 +0000 (0:00:00.527) 0:09:14.337 **** 2025-09-27 00:53:56.640168 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:53:56.640172 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:53:56.640176 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:53:56.640180 | orchestrator | 2025-09-27 00:53:56.640184 | orchestrator | TASK [ceph-handler : Check for a mds container] ******************************** 2025-09-27 00:53:56.640189 | orchestrator | Saturday 27 September 2025 00:52:47 +0000 (0:00:00.702) 0:09:15.039 **** 2025-09-27 00:53:56.640193 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:53:56.640197 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:53:56.640201 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:53:56.640205 | orchestrator | 2025-09-27 00:53:56.640209 | orchestrator | TASK [ceph-handler : Check for a rgw container] ******************************** 2025-09-27 00:53:56.640213 | orchestrator | Saturday 27 September 2025 00:52:48 +0000 (0:00:00.738) 0:09:15.777 **** 2025-09-27 00:53:56.640217 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:53:56.640221 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:53:56.640225 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:53:56.640229 | orchestrator | 2025-09-27 00:53:56.640233 | orchestrator | TASK [ceph-handler : Check for a mgr container] ******************************** 2025-09-27 00:53:56.640238 | orchestrator | Saturday 27 September 2025 00:52:48 +0000 (0:00:00.668) 0:09:16.446 **** 2025-09-27 00:53:56.640242 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:53:56.640246 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:53:56.640250 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:53:56.640254 | orchestrator | 2025-09-27 00:53:56.640258 | orchestrator | TASK [ceph-handler : Check for a rbd mirror container] ************************* 2025-09-27 00:53:56.640262 | orchestrator | Saturday 27 September 2025 00:52:49 +0000 (0:00:00.575) 0:09:17.022 **** 2025-09-27 00:53:56.640266 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:53:56.640270 | 
orchestrator | skipping: [testbed-node-4] 2025-09-27 00:53:56.640274 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:53:56.640278 | orchestrator | 2025-09-27 00:53:56.640294 | orchestrator | TASK [ceph-handler : Check for a nfs container] ******************************** 2025-09-27 00:53:56.640299 | orchestrator | Saturday 27 September 2025 00:52:49 +0000 (0:00:00.314) 0:09:17.337 **** 2025-09-27 00:53:56.640308 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:53:56.640312 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:53:56.640316 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:53:56.640321 | orchestrator | 2025-09-27 00:53:56.640325 | orchestrator | TASK [ceph-handler : Check for a ceph-crash container] ************************* 2025-09-27 00:53:56.640329 | orchestrator | Saturday 27 September 2025 00:52:50 +0000 (0:00:00.311) 0:09:17.648 **** 2025-09-27 00:53:56.640333 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:53:56.640337 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:53:56.640341 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:53:56.640345 | orchestrator | 2025-09-27 00:53:56.640349 | orchestrator | TASK [ceph-handler : Check for a ceph-exporter container] ********************** 2025-09-27 00:53:56.640353 | orchestrator | Saturday 27 September 2025 00:52:50 +0000 (0:00:00.727) 0:09:18.376 **** 2025-09-27 00:53:56.640357 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:53:56.640361 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:53:56.640365 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:53:56.640369 | orchestrator | 2025-09-27 00:53:56.640374 | orchestrator | TASK [ceph-handler : Include check_socket_non_container.yml] ******************* 2025-09-27 00:53:56.640378 | orchestrator | Saturday 27 September 2025 00:52:51 +0000 (0:00:00.992) 0:09:19.369 **** 2025-09-27 00:53:56.640382 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:53:56.640386 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:53:56.640390 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:53:56.640394 | orchestrator | 2025-09-27 00:53:56.640398 | orchestrator | TASK [ceph-handler : Set_fact handler_mon_status] ****************************** 2025-09-27 00:53:56.640402 | orchestrator | Saturday 27 September 2025 00:52:52 +0000 (0:00:00.306) 0:09:19.676 **** 2025-09-27 00:53:56.640406 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:53:56.640410 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:53:56.640414 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:53:56.640418 | orchestrator | 2025-09-27 00:53:56.640422 | orchestrator | TASK [ceph-handler : Set_fact handler_osd_status] ****************************** 2025-09-27 00:53:56.640427 | orchestrator | Saturday 27 September 2025 00:52:52 +0000 (0:00:00.305) 0:09:19.981 **** 2025-09-27 00:53:56.640431 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:53:56.640435 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:53:56.640439 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:53:56.640443 | orchestrator | 2025-09-27 00:53:56.640447 | orchestrator | TASK [ceph-handler : Set_fact handler_mds_status] ****************************** 2025-09-27 00:53:56.640451 | orchestrator | Saturday 27 September 2025 00:52:52 +0000 (0:00:00.339) 0:09:20.321 **** 2025-09-27 00:53:56.640455 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:53:56.640459 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:53:56.640463 | orchestrator | ok: [testbed-node-5] 2025-09-27 
00:53:56.640467 | orchestrator | 2025-09-27 00:53:56.640471 | orchestrator | TASK [ceph-handler : Set_fact handler_rgw_status] ****************************** 2025-09-27 00:53:56.640475 | orchestrator | Saturday 27 September 2025 00:52:53 +0000 (0:00:00.564) 0:09:20.885 **** 2025-09-27 00:53:56.640482 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:53:56.640486 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:53:56.640490 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:53:56.640494 | orchestrator | 2025-09-27 00:53:56.640498 | orchestrator | TASK [ceph-handler : Set_fact handler_nfs_status] ****************************** 2025-09-27 00:53:56.640503 | orchestrator | Saturday 27 September 2025 00:52:53 +0000 (0:00:00.332) 0:09:21.218 **** 2025-09-27 00:53:56.640507 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:53:56.640511 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:53:56.640515 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:53:56.640519 | orchestrator | 2025-09-27 00:53:56.640523 | orchestrator | TASK [ceph-handler : Set_fact handler_rbd_status] ****************************** 2025-09-27 00:53:56.640527 | orchestrator | Saturday 27 September 2025 00:52:53 +0000 (0:00:00.304) 0:09:21.523 **** 2025-09-27 00:53:56.640531 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:53:56.640535 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:53:56.640543 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:53:56.640547 | orchestrator | 2025-09-27 00:53:56.640551 | orchestrator | TASK [ceph-handler : Set_fact handler_mgr_status] ****************************** 2025-09-27 00:53:56.640555 | orchestrator | Saturday 27 September 2025 00:52:54 +0000 (0:00:00.336) 0:09:21.860 **** 2025-09-27 00:53:56.640559 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:53:56.640563 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:53:56.640567 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:53:56.640571 | orchestrator | 2025-09-27 00:53:56.640575 | orchestrator | TASK [ceph-handler : Set_fact handler_crash_status] **************************** 2025-09-27 00:53:56.640579 | orchestrator | Saturday 27 September 2025 00:52:54 +0000 (0:00:00.551) 0:09:22.411 **** 2025-09-27 00:53:56.640583 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:53:56.640587 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:53:56.640591 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:53:56.640595 | orchestrator | 2025-09-27 00:53:56.640599 | orchestrator | TASK [ceph-handler : Set_fact handler_exporter_status] ************************* 2025-09-27 00:53:56.640604 | orchestrator | Saturday 27 September 2025 00:52:55 +0000 (0:00:00.337) 0:09:22.749 **** 2025-09-27 00:53:56.640608 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:53:56.640612 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:53:56.640616 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:53:56.640620 | orchestrator | 2025-09-27 00:53:56.640624 | orchestrator | TASK [ceph-rgw : Include common.yml] ******************************************* 2025-09-27 00:53:56.640628 | orchestrator | Saturday 27 September 2025 00:52:55 +0000 (0:00:00.592) 0:09:23.341 **** 2025-09-27 00:53:56.640632 | orchestrator | included: /ansible/roles/ceph-rgw/tasks/common.yml for testbed-node-3, testbed-node-4, testbed-node-5 2025-09-27 00:53:56.640636 | orchestrator | 2025-09-27 00:53:56.640640 | orchestrator | TASK [ceph-rgw : Get keys from monitors] *************************************** 2025-09-27 
00:53:56.640644 | orchestrator | Saturday 27 September 2025 00:52:56 +0000 (0:00:00.754) 0:09:24.095 **** 2025-09-27 00:53:56.640648 | orchestrator | ok: [testbed-node-3 -> testbed-node-0(192.168.16.10)] => (item=None) 2025-09-27 00:53:56.640653 | orchestrator | skipping: [testbed-node-3] => (item=None)  2025-09-27 00:53:56.640657 | orchestrator | ok: [testbed-node-3 -> {{ groups.get(mon_group_name)[0] }}] 2025-09-27 00:53:56.640661 | orchestrator | 2025-09-27 00:53:56.640677 | orchestrator | TASK [ceph-rgw : Copy ceph key(s) if needed] *********************************** 2025-09-27 00:53:56.640682 | orchestrator | Saturday 27 September 2025 00:52:58 +0000 (0:00:02.329) 0:09:26.425 **** 2025-09-27 00:53:56.640686 | orchestrator | changed: [testbed-node-3] => (item=None) 2025-09-27 00:53:56.640690 | orchestrator | skipping: [testbed-node-3] => (item=None)  2025-09-27 00:53:56.640694 | orchestrator | changed: [testbed-node-3] 2025-09-27 00:53:56.640698 | orchestrator | changed: [testbed-node-4] => (item=None) 2025-09-27 00:53:56.640702 | orchestrator | skipping: [testbed-node-4] => (item=None)  2025-09-27 00:53:56.640706 | orchestrator | changed: [testbed-node-4] 2025-09-27 00:53:56.640710 | orchestrator | changed: [testbed-node-5] => (item=None) 2025-09-27 00:53:56.640714 | orchestrator | skipping: [testbed-node-5] => (item=None)  2025-09-27 00:53:56.640718 | orchestrator | changed: [testbed-node-5] 2025-09-27 00:53:56.640722 | orchestrator | 2025-09-27 00:53:56.640726 | orchestrator | TASK [ceph-rgw : Copy SSL certificate & key data to certificate path] ********** 2025-09-27 00:53:56.640731 | orchestrator | Saturday 27 September 2025 00:53:00 +0000 (0:00:01.243) 0:09:27.669 **** 2025-09-27 00:53:56.640735 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:53:56.640739 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:53:56.640743 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:53:56.640747 | orchestrator | 2025-09-27 00:53:56.640751 | orchestrator | TASK [ceph-rgw : Include_tasks pre_requisite.yml] ****************************** 2025-09-27 00:53:56.640755 | orchestrator | Saturday 27 September 2025 00:53:00 +0000 (0:00:00.528) 0:09:28.198 **** 2025-09-27 00:53:56.640759 | orchestrator | included: /ansible/roles/ceph-rgw/tasks/pre_requisite.yml for testbed-node-3, testbed-node-4, testbed-node-5 2025-09-27 00:53:56.640766 | orchestrator | 2025-09-27 00:53:56.640770 | orchestrator | TASK [ceph-rgw : Create rados gateway directories] ***************************** 2025-09-27 00:53:56.640774 | orchestrator | Saturday 27 September 2025 00:53:01 +0000 (0:00:00.573) 0:09:28.771 **** 2025-09-27 00:53:56.640779 | orchestrator | changed: [testbed-node-3 -> testbed-node-0(192.168.16.10)] => (item={'instance_name': 'rgw0', 'radosgw_address': '192.168.16.13', 'radosgw_frontend_port': 8081}) 2025-09-27 00:53:56.640783 | orchestrator | changed: [testbed-node-4 -> testbed-node-0(192.168.16.10)] => (item={'instance_name': 'rgw0', 'radosgw_address': '192.168.16.14', 'radosgw_frontend_port': 8081}) 2025-09-27 00:53:56.640787 | orchestrator | changed: [testbed-node-5 -> testbed-node-0(192.168.16.10)] => (item={'instance_name': 'rgw0', 'radosgw_address': '192.168.16.15', 'radosgw_frontend_port': 8081}) 2025-09-27 00:53:56.640791 | orchestrator | 2025-09-27 00:53:56.640795 | orchestrator | TASK [ceph-rgw : Create rgw keyrings] ****************************************** 2025-09-27 00:53:56.640799 | orchestrator | Saturday 27 September 2025 00:53:01 +0000 (0:00:00.806) 0:09:29.577 **** 
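
The "Create rgw keyrings" task above is delegated to the first monitor host (testbed-node-0), because only a node that already holds the admin keyring can mint new cephx keys for the gateways. A minimal hand-rolled sketch of that pattern follows; the real ceph-ansible role uses its own ceph_key module, and the container name, group name and capability string here are illustrative assumptions, not the role's exact implementation:

    - name: Create rgw keyring on the first monitor (sketch)
      # Assumption: the mon container is named ceph-mon-<hostname> and the
      # inventory group holding the monitors is called "mons".
      ansible.builtin.command: >
        docker exec ceph-mon-{{ groups['mons'][0] }}
        ceph auth get-or-create client.rgw.{{ inventory_hostname }}.rgw0
        osd 'allow rwx' mon 'allow rw'
        -o /etc/ceph/ceph.client.rgw.{{ inventory_hostname }}.rgw0.keyring
      delegate_to: "{{ groups['mons'][0] }}"

The subsequent "Copy ceph key(s) if needed" task then ships the generated keyring from the monitor to each RGW host, which is why it reports changed on testbed-node-3 through testbed-node-5.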
2025-09-27 00:53:56.640806 | orchestrator | changed: [testbed-node-3 -> testbed-node-0(192.168.16.10)] => (item=None) 2025-09-27 00:53:56.640810 | orchestrator | changed: [testbed-node-3 -> {{ groups[mon_group_name][0] if groups.get(mon_group_name, []) | length > 0 else 'localhost' }}] 2025-09-27 00:53:56.640814 | orchestrator | changed: [testbed-node-4 -> testbed-node-0(192.168.16.10)] => (item=None) 2025-09-27 00:53:56.640818 | orchestrator | changed: [testbed-node-4 -> {{ groups[mon_group_name][0] if groups.get(mon_group_name, []) | length > 0 else 'localhost' }}] 2025-09-27 00:53:56.640822 | orchestrator | changed: [testbed-node-5 -> testbed-node-0(192.168.16.10)] => (item=None) 2025-09-27 00:53:56.640826 | orchestrator | changed: [testbed-node-5 -> {{ groups[mon_group_name][0] if groups.get(mon_group_name, []) | length > 0 else 'localhost' }}] 2025-09-27 00:53:56.640830 | orchestrator | 2025-09-27 00:53:56.640835 | orchestrator | TASK [ceph-rgw : Get keys from monitors] *************************************** 2025-09-27 00:53:56.640839 | orchestrator | Saturday 27 September 2025 00:53:06 +0000 (0:00:05.042) 0:09:34.620 **** 2025-09-27 00:53:56.640843 | orchestrator | ok: [testbed-node-3 -> testbed-node-0(192.168.16.10)] => (item=None) 2025-09-27 00:53:56.640847 | orchestrator | ok: [testbed-node-3 -> {{ groups.get(mon_group_name)[0] }}] 2025-09-27 00:53:56.640851 | orchestrator | ok: [testbed-node-4 -> testbed-node-0(192.168.16.10)] => (item=None) 2025-09-27 00:53:56.640855 | orchestrator | ok: [testbed-node-4 -> {{ groups.get(mon_group_name)[0] }}] 2025-09-27 00:53:56.640859 | orchestrator | ok: [testbed-node-5 -> testbed-node-0(192.168.16.10)] => (item=None) 2025-09-27 00:53:56.640863 | orchestrator | ok: [testbed-node-5 -> {{ groups.get(mon_group_name)[0] }}] 2025-09-27 00:53:56.640867 | orchestrator | 2025-09-27 00:53:56.640871 | orchestrator | TASK [ceph-rgw : Copy ceph key(s) if needed] *********************************** 2025-09-27 00:53:56.640875 | orchestrator | Saturday 27 September 2025 00:53:09 +0000 (0:00:02.399) 0:09:37.019 **** 2025-09-27 00:53:56.640879 | orchestrator | changed: [testbed-node-3] => (item=None) 2025-09-27 00:53:56.640883 | orchestrator | changed: [testbed-node-3] 2025-09-27 00:53:56.640887 | orchestrator | changed: [testbed-node-4] => (item=None) 2025-09-27 00:53:56.640891 | orchestrator | changed: [testbed-node-4] 2025-09-27 00:53:56.640895 | orchestrator | changed: [testbed-node-5] => (item=None) 2025-09-27 00:53:56.640899 | orchestrator | changed: [testbed-node-5] 2025-09-27 00:53:56.640903 | orchestrator | 2025-09-27 00:53:56.640907 | orchestrator | TASK [ceph-rgw : Rgw pool creation tasks] ************************************** 2025-09-27 00:53:56.640911 | orchestrator | Saturday 27 September 2025 00:53:10 +0000 (0:00:01.321) 0:09:38.340 **** 2025-09-27 00:53:56.640916 | orchestrator | included: /ansible/roles/ceph-rgw/tasks/rgw_create_pools.yml for testbed-node-3 2025-09-27 00:53:56.640920 | orchestrator | 2025-09-27 00:53:56.640924 | orchestrator | TASK [ceph-rgw : Create ec profile] ******************************************** 2025-09-27 00:53:56.640930 | orchestrator | Saturday 27 September 2025 00:53:10 +0000 (0:00:00.253) 0:09:38.594 **** 2025-09-27 00:53:56.640938 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'default.rgw.buckets.data', 'value': {'pg_num': 8, 'size': 3, 'type': 'replicated'}})  2025-09-27 00:53:56.640942 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'default.rgw.buckets.index', 'value': 
{'pg_num': 8, 'size': 3, 'type': 'replicated'}})  2025-09-27 00:53:56.640947 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'default.rgw.control', 'value': {'pg_num': 8, 'size': 3, 'type': 'replicated'}})  2025-09-27 00:53:56.640951 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'default.rgw.log', 'value': {'pg_num': 8, 'size': 3, 'type': 'replicated'}})  2025-09-27 00:53:56.640955 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'default.rgw.meta', 'value': {'pg_num': 8, 'size': 3, 'type': 'replicated'}})  2025-09-27 00:53:56.640959 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:53:56.640963 | orchestrator | 2025-09-27 00:53:56.640967 | orchestrator | TASK [ceph-rgw : Set crush rule] *********************************************** 2025-09-27 00:53:56.640971 | orchestrator | Saturday 27 September 2025 00:53:12 +0000 (0:00:01.076) 0:09:39.670 **** 2025-09-27 00:53:56.640975 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'default.rgw.buckets.data', 'value': {'pg_num': 8, 'size': 3, 'type': 'replicated'}})  2025-09-27 00:53:56.640979 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'default.rgw.buckets.index', 'value': {'pg_num': 8, 'size': 3, 'type': 'replicated'}})  2025-09-27 00:53:56.640984 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'default.rgw.control', 'value': {'pg_num': 8, 'size': 3, 'type': 'replicated'}})  2025-09-27 00:53:56.640988 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'default.rgw.log', 'value': {'pg_num': 8, 'size': 3, 'type': 'replicated'}})  2025-09-27 00:53:56.640992 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'default.rgw.meta', 'value': {'pg_num': 8, 'size': 3, 'type': 'replicated'}})  2025-09-27 00:53:56.640996 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:53:56.641000 | orchestrator | 2025-09-27 00:53:56.641004 | orchestrator | TASK [ceph-rgw : Create rgw pools] ********************************************* 2025-09-27 00:53:56.641008 | orchestrator | Saturday 27 September 2025 00:53:13 +0000 (0:00:01.040) 0:09:40.711 **** 2025-09-27 00:53:56.641012 | orchestrator | changed: [testbed-node-3 -> testbed-node-0(192.168.16.10)] => (item={'key': 'default.rgw.buckets.data', 'value': {'pg_num': 8, 'size': 3, 'type': 'replicated'}}) 2025-09-27 00:53:56.641018 | orchestrator | changed: [testbed-node-3 -> testbed-node-0(192.168.16.10)] => (item={'key': 'default.rgw.buckets.index', 'value': {'pg_num': 8, 'size': 3, 'type': 'replicated'}}) 2025-09-27 00:53:56.641023 | orchestrator | changed: [testbed-node-3 -> testbed-node-0(192.168.16.10)] => (item={'key': 'default.rgw.control', 'value': {'pg_num': 8, 'size': 3, 'type': 'replicated'}}) 2025-09-27 00:53:56.641027 | orchestrator | changed: [testbed-node-3 -> testbed-node-0(192.168.16.10)] => (item={'key': 'default.rgw.log', 'value': {'pg_num': 8, 'size': 3, 'type': 'replicated'}}) 2025-09-27 00:53:56.641031 | orchestrator | changed: [testbed-node-3 -> testbed-node-0(192.168.16.10)] => (item={'key': 'default.rgw.meta', 'value': {'pg_num': 8, 'size': 3, 'type': 'replicated'}}) 2025-09-27 00:53:56.641035 | orchestrator | 2025-09-27 00:53:56.641039 | orchestrator | TASK [ceph-rgw : Include_tasks openstack-keystone.yml] ************************* 2025-09-27 00:53:56.641043 | orchestrator | Saturday 27 September 2025 00:53:43 +0000 (0:00:30.632) 0:10:11.343 **** 2025-09-27 00:53:56.641047 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:53:56.641052 | orchestrator | skipping: 
[testbed-node-4] 2025-09-27 00:53:56.641056 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:53:56.641060 | orchestrator | 2025-09-27 00:53:56.641064 | orchestrator | TASK [ceph-rgw : Include_tasks start_radosgw.yml] ****************************** 2025-09-27 00:53:56.641068 | orchestrator | Saturday 27 September 2025 00:53:44 +0000 (0:00:00.532) 0:10:11.876 **** 2025-09-27 00:53:56.641072 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:53:56.641079 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:53:56.641083 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:53:56.641098 | orchestrator | 2025-09-27 00:53:56.641103 | orchestrator | TASK [ceph-rgw : Include start_docker_rgw.yml] ********************************* 2025-09-27 00:53:56.641107 | orchestrator | Saturday 27 September 2025 00:53:44 +0000 (0:00:00.314) 0:10:12.191 **** 2025-09-27 00:53:56.641111 | orchestrator | included: /ansible/roles/ceph-rgw/tasks/start_docker_rgw.yml for testbed-node-3, testbed-node-4, testbed-node-5 2025-09-27 00:53:56.641115 | orchestrator | 2025-09-27 00:53:56.641119 | orchestrator | TASK [ceph-rgw : Include_task systemd.yml] ************************************* 2025-09-27 00:53:56.641123 | orchestrator | Saturday 27 September 2025 00:53:45 +0000 (0:00:00.515) 0:10:12.706 **** 2025-09-27 00:53:56.641127 | orchestrator | included: /ansible/roles/ceph-rgw/tasks/systemd.yml for testbed-node-3, testbed-node-4, testbed-node-5 2025-09-27 00:53:56.641131 | orchestrator | 2025-09-27 00:53:56.641135 | orchestrator | TASK [ceph-rgw : Generate systemd unit file] *********************************** 2025-09-27 00:53:56.641140 | orchestrator | Saturday 27 September 2025 00:53:45 +0000 (0:00:00.827) 0:10:13.533 **** 2025-09-27 00:53:56.641144 | orchestrator | changed: [testbed-node-3] 2025-09-27 00:53:56.641148 | orchestrator | changed: [testbed-node-4] 2025-09-27 00:53:56.641152 | orchestrator | changed: [testbed-node-5] 2025-09-27 00:53:56.641156 | orchestrator | 2025-09-27 00:53:56.641162 | orchestrator | TASK [ceph-rgw : Generate systemd ceph-radosgw target file] ******************** 2025-09-27 00:53:56.641167 | orchestrator | Saturday 27 September 2025 00:53:47 +0000 (0:00:01.266) 0:10:14.800 **** 2025-09-27 00:53:56.641171 | orchestrator | changed: [testbed-node-3] 2025-09-27 00:53:56.641175 | orchestrator | changed: [testbed-node-4] 2025-09-27 00:53:56.641179 | orchestrator | changed: [testbed-node-5] 2025-09-27 00:53:56.641183 | orchestrator | 2025-09-27 00:53:56.641187 | orchestrator | TASK [ceph-rgw : Enable ceph-radosgw.target] *********************************** 2025-09-27 00:53:56.641191 | orchestrator | Saturday 27 September 2025 00:53:48 +0000 (0:00:01.146) 0:10:15.946 **** 2025-09-27 00:53:56.641195 | orchestrator | changed: [testbed-node-3] 2025-09-27 00:53:56.641199 | orchestrator | changed: [testbed-node-4] 2025-09-27 00:53:56.641204 | orchestrator | changed: [testbed-node-5] 2025-09-27 00:53:56.641208 | orchestrator | 2025-09-27 00:53:56.641212 | orchestrator | TASK [ceph-rgw : Systemd start rgw container] ********************************** 2025-09-27 00:53:56.641216 | orchestrator | Saturday 27 September 2025 00:53:50 +0000 (0:00:02.004) 0:10:17.951 **** 2025-09-27 00:53:56.641220 | orchestrator | changed: [testbed-node-3] => (item={'instance_name': 'rgw0', 'radosgw_address': '192.168.16.13', 'radosgw_frontend_port': 8081}) 2025-09-27 00:53:56.641224 | orchestrator | changed: [testbed-node-4] => (item={'instance_name': 'rgw0', 'radosgw_address': '192.168.16.14', 
'radosgw_frontend_port': 8081}) 2025-09-27 00:53:56.641228 | orchestrator | changed: [testbed-node-5] => (item={'instance_name': 'rgw0', 'radosgw_address': '192.168.16.15', 'radosgw_frontend_port': 8081}) 2025-09-27 00:53:56.641232 | orchestrator | 2025-09-27 00:53:56.641236 | orchestrator | RUNNING HANDLER [ceph-handler : Make tempdir for scripts] ********************** 2025-09-27 00:53:56.641241 | orchestrator | Saturday 27 September 2025 00:53:52 +0000 (0:00:02.323) 0:10:20.274 **** 2025-09-27 00:53:56.641245 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:53:56.641249 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:53:56.641253 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:53:56.641257 | orchestrator | 2025-09-27 00:53:56.641261 | orchestrator | RUNNING HANDLER [ceph-handler : Rgws handler] ********************************** 2025-09-27 00:53:56.641265 | orchestrator | Saturday 27 September 2025 00:53:53 +0000 (0:00:00.558) 0:10:20.832 **** 2025-09-27 00:53:56.641269 | orchestrator | included: /ansible/roles/ceph-handler/tasks/handler_rgws.yml for testbed-node-3, testbed-node-4, testbed-node-5 2025-09-27 00:53:56.641273 | orchestrator | 2025-09-27 00:53:56.641277 | orchestrator | RUNNING HANDLER [ceph-handler : Set _rgw_handler_called before restart] ******** 2025-09-27 00:53:56.641282 | orchestrator | Saturday 27 September 2025 00:53:53 +0000 (0:00:00.521) 0:10:21.354 **** 2025-09-27 00:53:56.641289 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:53:56.641293 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:53:56.641297 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:53:56.641301 | orchestrator | 2025-09-27 00:53:56.641305 | orchestrator | RUNNING HANDLER [ceph-handler : Copy rgw restart script] *********************** 2025-09-27 00:53:56.641312 | orchestrator | Saturday 27 September 2025 00:53:54 +0000 (0:00:00.295) 0:10:21.649 **** 2025-09-27 00:53:56.641316 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:53:56.641320 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:53:56.641324 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:53:56.641329 | orchestrator | 2025-09-27 00:53:56.641333 | orchestrator | RUNNING HANDLER [ceph-handler : Restart ceph rgw daemon(s)] ******************** 2025-09-27 00:53:56.641337 | orchestrator | Saturday 27 September 2025 00:53:54 +0000 (0:00:00.570) 0:10:22.220 **** 2025-09-27 00:53:56.641341 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-3)  2025-09-27 00:53:56.641345 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-4)  2025-09-27 00:53:56.641349 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-5)  2025-09-27 00:53:56.641353 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:53:56.641357 | orchestrator | 2025-09-27 00:53:56.641361 | orchestrator | RUNNING HANDLER [ceph-handler : Set _rgw_handler_called after restart] ********* 2025-09-27 00:53:56.641365 | orchestrator | Saturday 27 September 2025 00:53:55 +0000 (0:00:00.641) 0:10:22.862 **** 2025-09-27 00:53:56.641370 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:53:56.641374 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:53:56.641378 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:53:56.641382 | orchestrator | 2025-09-27 00:53:56.641386 | orchestrator | PLAY RECAP ********************************************************************* 2025-09-27 00:53:56.641390 | orchestrator | testbed-node-0 : ok=141  changed=36  unreachable=0 failed=0 skipped=135  rescued=0 
ignored=0 2025-09-27 00:53:56.641394 | orchestrator | testbed-node-1 : ok=127  changed=32  unreachable=0 failed=0 skipped=120  rescued=0 ignored=0 2025-09-27 00:53:56.641398 | orchestrator | testbed-node-2 : ok=134  changed=33  unreachable=0 failed=0 skipped=119  rescued=0 ignored=0 2025-09-27 00:53:56.641403 | orchestrator | testbed-node-3 : ok=186  changed=44  unreachable=0 failed=0 skipped=152  rescued=0 ignored=0 2025-09-27 00:53:56.641407 | orchestrator | testbed-node-4 : ok=175  changed=40  unreachable=0 failed=0 skipped=123  rescued=0 ignored=0 2025-09-27 00:53:56.641411 | orchestrator | testbed-node-5 : ok=177  changed=41  unreachable=0 failed=0 skipped=121  rescued=0 ignored=0 2025-09-27 00:53:56.641415 | orchestrator | 2025-09-27 00:53:56.641419 | orchestrator | 2025-09-27 00:53:56.641423 | orchestrator | 2025-09-27 00:53:56.641427 | orchestrator | TASKS RECAP ******************************************************************** 2025-09-27 00:53:56.641432 | orchestrator | Saturday 27 September 2025 00:53:55 +0000 (0:00:00.240) 0:10:23.103 **** 2025-09-27 00:53:56.641437 | orchestrator | =============================================================================== 2025-09-27 00:53:56.641442 | orchestrator | ceph-osd : Use ceph-volume to create osds ------------------------------ 41.71s 2025-09-27 00:53:56.641446 | orchestrator | ceph-container-common : Pulling Ceph container image ------------------- 41.66s 2025-09-27 00:53:56.641450 | orchestrator | ceph-mgr : Wait for all mgr to be up ----------------------------------- 36.40s 2025-09-27 00:53:56.641454 | orchestrator | ceph-rgw : Create rgw pools -------------------------------------------- 30.63s 2025-09-27 00:53:56.641458 | orchestrator | ceph-mon : Set cluster configs ----------------------------------------- 15.25s 2025-09-27 00:53:56.641462 | orchestrator | ceph-osd : Wait for all osd to be up ----------------------------------- 12.83s 2025-09-27 00:53:56.641466 | orchestrator | ceph-mgr : Create ceph mgr keyring(s) on a mon node -------------------- 10.93s 2025-09-27 00:53:56.641473 | orchestrator | ceph-mon : Fetch ceph initial keys -------------------------------------- 9.07s 2025-09-27 00:53:56.641478 | orchestrator | ceph-mds : Create filesystem pools -------------------------------------- 8.85s 2025-09-27 00:53:56.641482 | orchestrator | ceph-config : Create ceph initial directories --------------------------- 6.91s 2025-09-27 00:53:56.641486 | orchestrator | ceph-mgr : Disable ceph mgr enabled modules ----------------------------- 6.45s 2025-09-27 00:53:56.641490 | orchestrator | ceph-rgw : Create rgw keyrings ------------------------------------------ 5.04s 2025-09-27 00:53:56.641494 | orchestrator | ceph-mgr : Add modules to ceph-mgr -------------------------------------- 4.85s 2025-09-27 00:53:56.641498 | orchestrator | ceph-facts : Set_fact _monitor_addresses - ipv4 ------------------------- 4.19s 2025-09-27 00:53:56.641502 | orchestrator | ceph-crash : Create client.crash keyring -------------------------------- 4.13s 2025-09-27 00:53:56.641506 | orchestrator | ceph-container-common : Get ceph version -------------------------------- 3.81s 2025-09-27 00:53:56.641510 | orchestrator | ceph-mds : Create ceph filesystem --------------------------------------- 3.71s 2025-09-27 00:53:56.641514 | orchestrator | ceph-mon : Copy admin keyring over to mons ------------------------------ 3.62s 2025-09-27 00:53:56.641518 | orchestrator | ceph-osd : Systemd start osd -------------------------------------------- 3.46s 
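
The ceph-rgw systemd tasks in the play above (generate unit file, generate ceph-radosgw.target, enable the target, start the per-instance container) follow a common template-then-enable pattern: render one systemd unit per RGW instance, then start it through systemd so the container survives reboots. A minimal sketch of that pattern, assuming an rgw_instances list shaped like the items in the log and an illustrative template name:

    - name: Generate systemd unit file for each rgw instance (sketch)
      ansible.builtin.template:
        src: ceph-radosgw.service.j2          # assumed template name
        dest: /etc/systemd/system/ceph-radosgw@.service
        mode: "0644"

    - name: Enable and start the per-instance rgw service (sketch)
      ansible.builtin.systemd:
        name: "ceph-radosgw@rgw.{{ inventory_hostname }}.{{ item.instance_name }}"
        state: started
        enabled: true
        daemon_reload: true
      loop: "{{ rgw_instances }}"             # e.g. [{'instance_name': 'rgw0', 'radosgw_address': ..., 'radosgw_frontend_port': 8081}]

Because the service is started by systemd rather than by the handler, the later "Restart ceph rgw daemon(s)" handler is skipped on a fresh deployment, as seen in the log.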
2025-09-27 00:53:56.641523 | orchestrator | ceph-crash : Start the ceph-crash service ------------------------------- 3.42s 2025-09-27 00:53:56.641527 | orchestrator | 2025-09-27 00:53:56.641531 | orchestrator | 2025-09-27 00:53:56.641535 | orchestrator | PLAY [Set kolla_action_mariadb] ************************************************ 2025-09-27 00:53:56.641539 | orchestrator | 2025-09-27 00:53:56.641543 | orchestrator | TASK [Inform the user about the following task] ******************************** 2025-09-27 00:53:56.641547 | orchestrator | Saturday 27 September 2025 00:49:35 +0000 (0:00:00.102) 0:00:00.102 **** 2025-09-27 00:53:56.641551 | orchestrator | ok: [localhost] => { 2025-09-27 00:53:56.641558 | orchestrator |  "msg": "The task 'Check MariaDB service' fails if the MariaDB service has not yet been deployed. This is fine." 2025-09-27 00:53:56.641562 | orchestrator | } 2025-09-27 00:53:56.641566 | orchestrator | 2025-09-27 00:53:56.641571 | orchestrator | TASK [Check MariaDB service] *************************************************** 2025-09-27 00:53:56.641575 | orchestrator | Saturday 27 September 2025 00:49:35 +0000 (0:00:00.050) 0:00:00.153 **** 2025-09-27 00:53:56.641579 | orchestrator | fatal: [localhost]: FAILED! => {"changed": false, "elapsed": 2, "msg": "Timeout when waiting for search string MariaDB in 192.168.16.9:3306"} 2025-09-27 00:53:56.641583 | orchestrator | ...ignoring 2025-09-27 00:53:56.641587 | orchestrator | 2025-09-27 00:53:56.641592 | orchestrator | TASK [Set kolla_action_mariadb = upgrade if MariaDB is already running] ******** 2025-09-27 00:53:56.641596 | orchestrator | Saturday 27 September 2025 00:49:38 +0000 (0:00:02.831) 0:00:02.984 **** 2025-09-27 00:53:56.641600 | orchestrator | skipping: [localhost] 2025-09-27 00:53:56.641604 | orchestrator | 2025-09-27 00:53:56.641608 | orchestrator | TASK [Set kolla_action_mariadb = kolla_action_ng] ****************************** 2025-09-27 00:53:56.641612 | orchestrator | Saturday 27 September 2025 00:49:38 +0000 (0:00:00.049) 0:00:03.033 **** 2025-09-27 00:53:56.641616 | orchestrator | ok: [localhost] 2025-09-27 00:53:56.641620 | orchestrator | 2025-09-27 00:53:56.641624 | orchestrator | PLAY [Group hosts based on configuration] ************************************** 2025-09-27 00:53:56.641628 | orchestrator | 2025-09-27 00:53:56.641633 | orchestrator | TASK [Group hosts based on Kolla action] *************************************** 2025-09-27 00:53:56.641637 | orchestrator | Saturday 27 September 2025 00:49:38 +0000 (0:00:00.184) 0:00:03.217 **** 2025-09-27 00:53:56.641641 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:53:56.641645 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:53:56.641649 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:53:56.641653 | orchestrator | 2025-09-27 00:53:56.641657 | orchestrator | TASK [Group hosts based on enabled services] *********************************** 2025-09-27 00:53:56.641661 | orchestrator | Saturday 27 September 2025 00:49:39 +0000 (0:00:00.295) 0:00:03.512 **** 2025-09-27 00:53:56.641668 | orchestrator | ok: [testbed-node-0] => (item=enable_mariadb_True) 2025-09-27 00:53:56.641672 | orchestrator | ok: [testbed-node-1] => (item=enable_mariadb_True) 2025-09-27 00:53:56.641676 | orchestrator | ok: [testbed-node-2] => (item=enable_mariadb_True) 2025-09-27 00:53:56.641680 | orchestrator | 2025-09-27 00:53:56.641685 | orchestrator | PLAY [Apply role mariadb] ****************************************************** 2025-09-27 00:53:56.641689 | orchestrator | 
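
The "Check MariaDB service" task above probes the database VIP (192.168.16.9:3306) for the MariaDB handshake banner and is deliberately allowed to fail on a fresh deployment, as the preceding informational task explains; its result then decides whether the subsequent Kolla run treats MariaDB as a new deploy or an upgrade. A minimal sketch of that probe-and-decide pattern, with variable names mirroring the log (the exact OSISM playbook may differ in detail):

    - name: Check MariaDB service            # fails harmlessly if MariaDB is not deployed yet
      ansible.builtin.wait_for:
        host: 192.168.16.9                   # database VIP from the log
        port: 3306
        search_regex: MariaDB                # MariaDB announces itself in the protocol handshake
        timeout: 2
      register: mariadb_check
      ignore_errors: true

    - name: Set kolla_action_mariadb = upgrade if MariaDB is already running
      ansible.builtin.set_fact:
        kolla_action_mariadb: upgrade
      when: mariadb_check is not failed

    - name: Set kolla_action_mariadb = kolla_action_ng
      ansible.builtin.set_fact:
        kolla_action_mariadb: "{{ kolla_action_ng }}"
      when: mariadb_check is failed

In this run the probe times out ("...ignoring"), so the upgrade branch is skipped and kolla_action_mariadb falls back to kolla_action_ng, i.e. a fresh MariaDB deployment.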
2025-09-27 00:53:56.641693 | orchestrator | TASK [mariadb : Group MariaDB hosts based on shards] *************************** 2025-09-27 00:53:56.641697 | orchestrator | Saturday 27 September 2025 00:49:39 +0000 (0:00:00.584) 0:00:04.097 **** 2025-09-27 00:53:56.641701 | orchestrator | ok: [testbed-node-0] => (item=testbed-node-0) 2025-09-27 00:53:56.641705 | orchestrator | ok: [testbed-node-0] => (item=testbed-node-1) 2025-09-27 00:53:56.641709 | orchestrator | ok: [testbed-node-0] => (item=testbed-node-2) 2025-09-27 00:53:56.641713 | orchestrator | 2025-09-27 00:53:56.641717 | orchestrator | TASK [mariadb : include_tasks] ************************************************* 2025-09-27 00:53:56.641721 | orchestrator | Saturday 27 September 2025 00:49:40 +0000 (0:00:00.507) 0:00:04.604 **** 2025-09-27 00:53:56.641727 | orchestrator | included: /ansible/roles/mariadb/tasks/deploy.yml for testbed-node-0, testbed-node-1, testbed-node-2 2025-09-27 00:53:56.641732 | orchestrator | 2025-09-27 00:53:56.641736 | orchestrator | TASK [mariadb : Ensuring config directories exist] ***************************** 2025-09-27 00:53:56.641740 | orchestrator | Saturday 27 September 2025 00:49:40 +0000 (0:00:00.590) 0:00:05.194 **** 2025-09-27 00:53:56.641748 | orchestrator | changed: [testbed-node-0] => (item={'key': 'mariadb', 'value': {'container_name': 'mariadb', 'group': 'mariadb_shard_0', 'enabled': True, 'image': 'registry.osism.tech/kolla/mariadb-server:2024.2', 'volumes': ['/etc/kolla/mariadb/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/hosts:/etc/hosts:ro', '/etc/timezone:/etc/timezone:ro', 'mariadb:/var/lib/mysql', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/clustercheck'], 'timeout': '30'}, 'environment': {'MYSQL_USERNAME': 'monitor', 'MYSQL_PASSWORD': 'iek7ooth9miesodoh2ongohcaachah0I', 'MYSQL_HOST': '192.168.16.10', 'AVAILABLE_WHEN_DONOR': '1'}, 'haproxy': {'mariadb': {'enabled': True, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s', ''], 'custom_member_list': [' server testbed-node-0 192.168.16.10:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 192.168.16.11:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 192.168.16.12:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}, 'mariadb_external_lb': {'enabled': False, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'custom_member_list': [' server testbed-node-0 testbed-node-0:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 testbed-node-1:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 testbed-node-2:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}}}}) 2025-09-27 00:53:56.641755 | orchestrator | changed: [testbed-node-1] => (item={'key': 'mariadb', 'value': {'container_name': 'mariadb', 'group': 'mariadb_shard_0', 'enabled': True, 'image': 'registry.osism.tech/kolla/mariadb-server:2024.2', 'volumes': ['/etc/kolla/mariadb/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/hosts:/etc/hosts:ro', '/etc/timezone:/etc/timezone:ro', 'mariadb:/var/lib/mysql', 'kolla_logs:/var/log/kolla/'], 'dimensions': 
{}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/clustercheck'], 'timeout': '30'}, 'environment': {'MYSQL_USERNAME': 'monitor', 'MYSQL_PASSWORD': 'iek7ooth9miesodoh2ongohcaachah0I', 'MYSQL_HOST': '192.168.16.11', 'AVAILABLE_WHEN_DONOR': '1'}, 'haproxy': {'mariadb': {'enabled': True, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s', ''], 'custom_member_list': [' server testbed-node-0 192.168.16.10:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 192.168.16.11:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 192.168.16.12:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}, 'mariadb_external_lb': {'enabled': False, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'custom_member_list': [' server testbed-node-0 testbed-node-0:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 testbed-node-1:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 testbed-node-2:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}}}}) 2025-09-27 00:53:56.641766 | orchestrator | changed: [testbed-node-2] => (item={'key': 'mariadb', 'value': {'container_name': 'mariadb', 'group': 'mariadb_shard_0', 'enabled': True, 'image': 'registry.osism.tech/kolla/mariadb-server:2024.2', 'volumes': ['/etc/kolla/mariadb/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/hosts:/etc/hosts:ro', '/etc/timezone:/etc/timezone:ro', 'mariadb:/var/lib/mysql', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/clustercheck'], 'timeout': '30'}, 'environment': {'MYSQL_USERNAME': 'monitor', 'MYSQL_PASSWORD': 'iek7ooth9miesodoh2ongohcaachah0I', 'MYSQL_HOST': '192.168.16.12', 'AVAILABLE_WHEN_DONOR': '1'}, 'haproxy': {'mariadb': {'enabled': True, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s', ''], 'custom_member_list': [' server testbed-node-0 192.168.16.10:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 192.168.16.11:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 192.168.16.12:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}, 'mariadb_external_lb': {'enabled': False, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'custom_member_list': [' server testbed-node-0 testbed-node-0:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 testbed-node-1:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 testbed-node-2:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}}}}) 2025-09-27 00:53:56.641771 | orchestrator | 2025-09-27 00:53:56.641775 | orchestrator | TASK [mariadb : Ensuring database backup config directory exists] ************** 2025-09-27 00:53:56.641779 | orchestrator | Saturday 27 September 2025 00:49:44 +0000 (0:00:03.645) 0:00:08.840 **** 2025-09-27 00:53:56.641783 | orchestrator | 
skipping: [testbed-node-1] 2025-09-27 00:53:56.641787 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:53:56.641792 | orchestrator | changed: [testbed-node-0] 2025-09-27 00:53:56.641796 | orchestrator | 2025-09-27 00:53:56.641800 | orchestrator | TASK [mariadb : Copying over my.cnf for mariabackup] *************************** 2025-09-27 00:53:56.641804 | orchestrator | Saturday 27 September 2025 00:49:44 +0000 (0:00:00.667) 0:00:09.508 **** 2025-09-27 00:53:56.641812 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:53:56.641816 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:53:56.641820 | orchestrator | changed: [testbed-node-0] 2025-09-27 00:53:56.641824 | orchestrator | 2025-09-27 00:53:56.641828 | orchestrator | TASK [mariadb : Copying over config.json files for services] ******************* 2025-09-27 00:53:56.641832 | orchestrator | Saturday 27 September 2025 00:49:46 +0000 (0:00:01.195) 0:00:10.703 **** 2025-09-27 00:53:56.641840 | orchestrator | changed: [testbed-node-1] => (item={'key': 'mariadb', 'value': {'container_name': 'mariadb', 'group': 'mariadb_shard_0', 'enabled': True, 'image': 'registry.osism.tech/kolla/mariadb-server:2024.2', 'volumes': ['/etc/kolla/mariadb/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/hosts:/etc/hosts:ro', '/etc/timezone:/etc/timezone:ro', 'mariadb:/var/lib/mysql', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/clustercheck'], 'timeout': '30'}, 'environment': {'MYSQL_USERNAME': 'monitor', 'MYSQL_PASSWORD': 'iek7ooth9miesodoh2ongohcaachah0I', 'MYSQL_HOST': '192.168.16.11', 'AVAILABLE_WHEN_DONOR': '1'}, 'haproxy': {'mariadb': {'enabled': True, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s', ''], 'custom_member_list': [' server testbed-node-0 192.168.16.10:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 192.168.16.11:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 192.168.16.12:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}, 'mariadb_external_lb': {'enabled': False, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'custom_member_list': [' server testbed-node-0 testbed-node-0:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 testbed-node-1:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 testbed-node-2:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}}}}) 2025-09-27 00:53:56.641848 | orchestrator | changed: [testbed-node-2] => (item={'key': 'mariadb', 'value': {'container_name': 'mariadb', 'group': 'mariadb_shard_0', 'enabled': True, 'image': 'registry.osism.tech/kolla/mariadb-server:2024.2', 'volumes': ['/etc/kolla/mariadb/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/hosts:/etc/hosts:ro', '/etc/timezone:/etc/timezone:ro', 'mariadb:/var/lib/mysql', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/clustercheck'], 'timeout': '30'}, 'environment': {'MYSQL_USERNAME': 'monitor', 'MYSQL_PASSWORD': 'iek7ooth9miesodoh2ongohcaachah0I', 'MYSQL_HOST': '192.168.16.12', 
'AVAILABLE_WHEN_DONOR': '1'}, 'haproxy': {'mariadb': {'enabled': True, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s', ''], 'custom_member_list': [' server testbed-node-0 192.168.16.10:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 192.168.16.11:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 192.168.16.12:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}, 'mariadb_external_lb': {'enabled': False, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'custom_member_list': [' server testbed-node-0 testbed-node-0:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 testbed-node-1:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 testbed-node-2:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}}}}) 2025-09-27 00:53:56.641855 | orchestrator | changed: [testbed-node-0] => (item={'key': 'mariadb', 'value': {'container_name': 'mariadb', 'group': 'mariadb_shard_0', 'enabled': True, 'image': 'registry.osism.tech/kolla/mariadb-server:2024.2', 'volumes': ['/etc/kolla/mariadb/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/hosts:/etc/hosts:ro', '/etc/timezone:/etc/timezone:ro', 'mariadb:/var/lib/mysql', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/clustercheck'], 'timeout': '30'}, 'environment': {'MYSQL_USERNAME': 'monitor', 'MYSQL_PASSWORD': 'iek7ooth9miesodoh2ongohcaachah0I', 'MYSQL_HOST': '192.168.16.10', 'AVAILABLE_WHEN_DONOR': '1'}, 'haproxy': {'mariadb': {'enabled': True, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s', ''], 'custom_member_list': [' server testbed-node-0 192.168.16.10:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 192.168.16.11:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 192.168.16.12:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}, 'mariadb_external_lb': {'enabled': False, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'custom_member_list': [' server testbed-node-0 testbed-node-0:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 testbed-node-1:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 testbed-node-2:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}}}}) 2025-09-27 00:53:56.641863 | orchestrator | 2025-09-27 00:53:56.641867 | orchestrator | TASK [mariadb : Copying over config.json files for mariabackup] **************** 2025-09-27 00:53:56.641871 | orchestrator | Saturday 27 September 2025 00:49:49 +0000 (0:00:03.314) 0:00:14.018 **** 2025-09-27 00:53:56.641875 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:53:56.641879 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:53:56.641884 | orchestrator | changed: [testbed-node-0] 2025-09-27 00:53:56.641888 | orchestrator | 2025-09-27 00:53:56.641892 | orchestrator | TASK [mariadb : 
Copying over galera.cnf] *************************************** 2025-09-27 00:53:56.641896 | orchestrator | Saturday 27 September 2025 00:49:50 +0000 (0:00:01.419) 0:00:15.437 **** 2025-09-27 00:53:56.641902 | orchestrator | changed: [testbed-node-0] 2025-09-27 00:53:56.641906 | orchestrator | changed: [testbed-node-2] 2025-09-27 00:53:56.641910 | orchestrator | changed: [testbed-node-1] 2025-09-27 00:53:56.641914 | orchestrator | 2025-09-27 00:53:56.641918 | orchestrator | TASK [mariadb : include_tasks] ************************************************* 2025-09-27 00:53:56.641923 | orchestrator | Saturday 27 September 2025 00:49:54 +0000 (0:00:03.945) 0:00:19.383 **** 2025-09-27 00:53:56.641927 | orchestrator | included: /ansible/roles/mariadb/tasks/copy-certs.yml for testbed-node-0, testbed-node-1, testbed-node-2 2025-09-27 00:53:56.641931 | orchestrator | 2025-09-27 00:53:56.641935 | orchestrator | TASK [service-cert-copy : mariadb | Copying over extra CA certificates] ******** 2025-09-27 00:53:56.641939 | orchestrator | Saturday 27 September 2025 00:49:55 +0000 (0:00:00.668) 0:00:20.051 **** 2025-09-27 00:53:56.641946 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'mariadb', 'value': {'container_name': 'mariadb', 'group': 'mariadb_shard_0', 'enabled': True, 'image': 'registry.osism.tech/kolla/mariadb-server:2024.2', 'volumes': ['/etc/kolla/mariadb/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/hosts:/etc/hosts:ro', '/etc/timezone:/etc/timezone:ro', 'mariadb:/var/lib/mysql', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/clustercheck'], 'timeout': '30'}, 'environment': {'MYSQL_USERNAME': 'monitor', 'MYSQL_PASSWORD': 'iek7ooth9miesodoh2ongohcaachah0I', 'MYSQL_HOST': '192.168.16.10', 'AVAILABLE_WHEN_DONOR': '1'}, 'haproxy': {'mariadb': {'enabled': True, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s', ''], 'custom_member_list': [' server testbed-node-0 192.168.16.10:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 192.168.16.11:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 192.168.16.12:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}, 'mariadb_external_lb': {'enabled': False, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'custom_member_list': [' server testbed-node-0 testbed-node-0:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 testbed-node-1:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 testbed-node-2:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}}}})  2025-09-27 00:53:56.641954 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:53:56.641961 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'mariadb', 'value': {'container_name': 'mariadb', 'group': 'mariadb_shard_0', 'enabled': True, 'image': 'registry.osism.tech/kolla/mariadb-server:2024.2', 'volumes': ['/etc/kolla/mariadb/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/hosts:/etc/hosts:ro', '/etc/timezone:/etc/timezone:ro', 'mariadb:/var/lib/mysql', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 
'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/clustercheck'], 'timeout': '30'}, 'environment': {'MYSQL_USERNAME': 'monitor', 'MYSQL_PASSWORD': 'iek7ooth9miesodoh2ongohcaachah0I', 'MYSQL_HOST': '192.168.16.11', 'AVAILABLE_WHEN_DONOR': '1'}, 'haproxy': {'mariadb': {'enabled': True, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s', ''], 'custom_member_list': [' server testbed-node-0 192.168.16.10:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 192.168.16.11:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 192.168.16.12:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}, 'mariadb_external_lb': {'enabled': False, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'custom_member_list': [' server testbed-node-0 testbed-node-0:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 testbed-node-1:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 testbed-node-2:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}}}})  2025-09-27 00:53:56.641965 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:53:56.641970 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'mariadb', 'value': {'container_name': 'mariadb', 'group': 'mariadb_shard_0', 'enabled': True, 'image': 'registry.osism.tech/kolla/mariadb-server:2024.2', 'volumes': ['/etc/kolla/mariadb/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/hosts:/etc/hosts:ro', '/etc/timezone:/etc/timezone:ro', 'mariadb:/var/lib/mysql', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/clustercheck'], 'timeout': '30'}, 'environment': {'MYSQL_USERNAME': 'monitor', 'MYSQL_PASSWORD': 'iek7ooth9miesodoh2ongohcaachah0I', 'MYSQL_HOST': '192.168.16.12', 'AVAILABLE_WHEN_DONOR': '1'}, 'haproxy': {'mariadb': {'enabled': True, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s', ''], 'custom_member_list': [' server testbed-node-0 192.168.16.10:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 192.168.16.11:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 192.168.16.12:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}, 'mariadb_external_lb': {'enabled': False, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'custom_member_list': [' server testbed-node-0 testbed-node-0:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 testbed-node-1:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 testbed-node-2:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}}}})  2025-09-27 00:53:56.641977 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:53:56.641981 | orchestrator | 2025-09-27 00:53:56.641988 | orchestrator | TASK [service-cert-copy : mariadb | Copying over backend internal TLS certificate] *** 2025-09-27 00:53:56.641992 | orchestrator | Saturday 
27 September 2025 00:49:58 +0000 (0:00:03.382) 0:00:23.433 **** 2025-09-27 00:53:56.641997 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'mariadb', 'value': {'container_name': 'mariadb', 'group': 'mariadb_shard_0', 'enabled': True, 'image': 'registry.osism.tech/kolla/mariadb-server:2024.2', 'volumes': ['/etc/kolla/mariadb/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/hosts:/etc/hosts:ro', '/etc/timezone:/etc/timezone:ro', 'mariadb:/var/lib/mysql', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/clustercheck'], 'timeout': '30'}, 'environment': {'MYSQL_USERNAME': 'monitor', 'MYSQL_PASSWORD': 'iek7ooth9miesodoh2ongohcaachah0I', 'MYSQL_HOST': '192.168.16.10', 'AVAILABLE_WHEN_DONOR': '1'}, 'haproxy': {'mariadb': {'enabled': True, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s', ''], 'custom_member_list': [' server testbed-node-0 192.168.16.10:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 192.168.16.11:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 192.168.16.12:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}, 'mariadb_external_lb': {'enabled': False, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'custom_member_list': [' server testbed-node-0 testbed-node-0:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 testbed-node-1:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 testbed-node-2:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}}}})  2025-09-27 00:53:56.642001 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:53:56.642008 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'mariadb', 'value': {'container_name': 'mariadb', 'group': 'mariadb_shard_0', 'enabled': True, 'image': 'registry.osism.tech/kolla/mariadb-server:2024.2', 'volumes': ['/etc/kolla/mariadb/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/hosts:/etc/hosts:ro', '/etc/timezone:/etc/timezone:ro', 'mariadb:/var/lib/mysql', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/clustercheck'], 'timeout': '30'}, 'environment': {'MYSQL_USERNAME': 'monitor', 'MYSQL_PASSWORD': 'iek7ooth9miesodoh2ongohcaachah0I', 'MYSQL_HOST': '192.168.16.12', 'AVAILABLE_WHEN_DONOR': '1'}, 'haproxy': {'mariadb': {'enabled': True, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s', ''], 'custom_member_list': [' server testbed-node-0 192.168.16.10:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 192.168.16.11:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 192.168.16.12:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}, 'mariadb_external_lb': {'enabled': False, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'custom_member_list': [' 
server testbed-node-0 testbed-node-0:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 testbed-node-1:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 testbed-node-2:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}}}})  2025-09-27 00:53:56.642040 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:53:56.642048 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'mariadb', 'value': {'container_name': 'mariadb', 'group': 'mariadb_shard_0', 'enabled': True, 'image': 'registry.osism.tech/kolla/mariadb-server:2024.2', 'volumes': ['/etc/kolla/mariadb/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/hosts:/etc/hosts:ro', '/etc/timezone:/etc/timezone:ro', 'mariadb:/var/lib/mysql', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/clustercheck'], 'timeout': '30'}, 'environment': {'MYSQL_USERNAME': 'monitor', 'MYSQL_PASSWORD': 'iek7ooth9miesodoh2ongohcaachah0I', 'MYSQL_HOST': '192.168.16.11', 'AVAILABLE_WHEN_DONOR': '1'}, 'haproxy': {'mariadb': {'enabled': True, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s', ''], 'custom_member_list': [' server testbed-node-0 192.168.16.10:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 192.168.16.11:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 192.168.16.12:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}, 'mariadb_external_lb': {'enabled': False, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'custom_member_list': [' server testbed-node-0 testbed-node-0:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 testbed-node-1:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 testbed-node-2:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}}}})  2025-09-27 00:53:56.642053 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:53:56.642057 | orchestrator | 2025-09-27 00:53:56.642061 | orchestrator | TASK [service-cert-copy : mariadb | Copying over backend internal TLS key] ***** 2025-09-27 00:53:56.642065 | orchestrator | Saturday 27 September 2025 00:50:02 +0000 (0:00:03.203) 0:00:26.637 **** 2025-09-27 00:53:56.642073 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'mariadb', 'value': {'container_name': 'mariadb', 'group': 'mariadb_shard_0', 'enabled': True, 'image': 'registry.osism.tech/kolla/mariadb-server:2024.2', 'volumes': ['/etc/kolla/mariadb/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/hosts:/etc/hosts:ro', '/etc/timezone:/etc/timezone:ro', 'mariadb:/var/lib/mysql', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/clustercheck'], 'timeout': '30'}, 'environment': {'MYSQL_USERNAME': 'monitor', 'MYSQL_PASSWORD': 'iek7ooth9miesodoh2ongohcaachah0I', 'MYSQL_HOST': '192.168.16.10', 'AVAILABLE_WHEN_DONOR': '1'}, 'haproxy': {'mariadb': {'enabled': True, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout 
server 3600s', ''], 'custom_member_list': [' server testbed-node-0 192.168.16.10:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 192.168.16.11:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 192.168.16.12:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}, 'mariadb_external_lb': {'enabled': False, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'custom_member_list': [' server testbed-node-0 testbed-node-0:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 testbed-node-1:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 testbed-node-2:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}}}})  2025-09-27 00:53:56.642080 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:53:56.642107 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'mariadb', 'value': {'container_name': 'mariadb', 'group': 'mariadb_shard_0', 'enabled': True, 'image': 'registry.osism.tech/kolla/mariadb-server:2024.2', 'volumes': ['/etc/kolla/mariadb/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/hosts:/etc/hosts:ro', '/etc/timezone:/etc/timezone:ro', 'mariadb:/var/lib/mysql', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/clustercheck'], 'timeout': '30'}, 'environment': {'MYSQL_USERNAME': 'monitor', 'MYSQL_PASSWORD': 'iek7ooth9miesodoh2ongohcaachah0I', 'MYSQL_HOST': '192.168.16.11', 'AVAILABLE_WHEN_DONOR': '1'}, 'haproxy': {'mariadb': {'enabled': True, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s', ''], 'custom_member_list': [' server testbed-node-0 192.168.16.10:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 192.168.16.11:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 192.168.16.12:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}, 'mariadb_external_lb': {'enabled': False, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'custom_member_list': [' server testbed-node-0 testbed-node-0:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 testbed-node-1:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 testbed-node-2:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}}}})  2025-09-27 00:53:56.642113 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:53:56.642121 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'mariadb', 'value': {'container_name': 'mariadb', 'group': 'mariadb_shard_0', 'enabled': True, 'image': 'registry.osism.tech/kolla/mariadb-server:2024.2', 'volumes': ['/etc/kolla/mariadb/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/hosts:/etc/hosts:ro', '/etc/timezone:/etc/timezone:ro', 'mariadb:/var/lib/mysql', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/clustercheck'], 'timeout': '30'}, 'environment': {'MYSQL_USERNAME': 'monitor', 'MYSQL_PASSWORD': 
'iek7ooth9miesodoh2ongohcaachah0I', 'MYSQL_HOST': '192.168.16.12', 'AVAILABLE_WHEN_DONOR': '1'}, 'haproxy': {'mariadb': {'enabled': True, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s', ''], 'custom_member_list': [' server testbed-node-0 192.168.16.10:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 192.168.16.11:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 192.168.16.12:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}, 'mariadb_external_lb': {'enabled': False, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'custom_member_list': [' server testbed-node-0 testbed-node-0:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 testbed-node-1:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 testbed-node-2:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}}}})  2025-09-27 00:53:56.642129 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:53:56.642133 | orchestrator | 2025-09-27 00:53:56.642137 | orchestrator | TASK [mariadb : Check mariadb containers] ************************************** 2025-09-27 00:53:56.642141 | orchestrator | Saturday 27 September 2025 00:50:06 +0000 (0:00:03.927) 0:00:30.564 **** 2025-09-27 00:53:56.642148 | orchestrator | changed: [testbed-node-1] => (item={'key': 'mariadb', 'value': {'container_name': 'mariadb', 'group': 'mariadb_shard_0', 'enabled': True, 'image': 'registry.osism.tech/kolla/mariadb-server:2024.2', 'volumes': ['/etc/kolla/mariadb/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/hosts:/etc/hosts:ro', '/etc/timezone:/etc/timezone:ro', 'mariadb:/var/lib/mysql', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/clustercheck'], 'timeout': '30'}, 'environment': {'MYSQL_USERNAME': 'monitor', 'MYSQL_PASSWORD': 'iek7ooth9miesodoh2ongohcaachah0I', 'MYSQL_HOST': '192.168.16.11', 'AVAILABLE_WHEN_DONOR': '1'}, 'haproxy': {'mariadb': {'enabled': True, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s', ''], 'custom_member_list': [' server testbed-node-0 192.168.16.10:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 192.168.16.11:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 192.168.16.12:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}, 'mariadb_external_lb': {'enabled': False, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'custom_member_list': [' server testbed-node-0 testbed-node-0:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 testbed-node-1:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 testbed-node-2:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}}}}) 2025-09-27 00:53:56.642156 | orchestrator | changed: [testbed-node-0] => (item={'key': 'mariadb', 'value': {'container_name': 'mariadb', 'group': 'mariadb_shard_0', 'enabled': True, 
'image': 'registry.osism.tech/kolla/mariadb-server:2024.2', 'volumes': ['/etc/kolla/mariadb/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/hosts:/etc/hosts:ro', '/etc/timezone:/etc/timezone:ro', 'mariadb:/var/lib/mysql', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/clustercheck'], 'timeout': '30'}, 'environment': {'MYSQL_USERNAME': 'monitor', 'MYSQL_PASSWORD': 'iek7ooth9miesodoh2ongohcaachah0I', 'MYSQL_HOST': '192.168.16.10', 'AVAILABLE_WHEN_DONOR': '1'}, 'haproxy': {'mariadb': {'enabled': True, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s', ''], 'custom_member_list': [' server testbed-node-0 192.168.16.10:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 192.168.16.11:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 192.168.16.12:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}, 'mariadb_external_lb': {'enabled': False, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'custom_member_list': [' server testbed-node-0 testbed-node-0:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 testbed-node-1:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 testbed-node-2:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}}}}) 2025-09-27 00:53:56.642170 | orchestrator | changed: [testbed-node-2] => (item={'key': 'mariadb', 'value': {'container_name': 'mariadb', 'group': 'mariadb_shard_0', 'enabled': True, 'image': 'registry.osism.tech/kolla/mariadb-server:2024.2', 'volumes': ['/etc/kolla/mariadb/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/hosts:/etc/hosts:ro', '/etc/timezone:/etc/timezone:ro', 'mariadb:/var/lib/mysql', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/clustercheck'], 'timeout': '30'}, 'environment': {'MYSQL_USERNAME': 'monitor', 'MYSQL_PASSWORD': 'iek7ooth9miesodoh2ongohcaachah0I', 'MYSQL_HOST': '192.168.16.12', 'AVAILABLE_WHEN_DONOR': '1'}, 'haproxy': {'mariadb': {'enabled': True, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s', ''], 'custom_member_list': [' server testbed-node-0 192.168.16.10:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 192.168.16.11:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 192.168.16.12:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}, 'mariadb_external_lb': {'enabled': False, 'mode': 'tcp', 'port': '3306', 'listen_port': '3306', 'frontend_tcp_extra': ['option clitcpka', 'timeout client 3600s'], 'backend_tcp_extra': ['option srvtcpka', 'timeout server 3600s'], 'custom_member_list': [' server testbed-node-0 testbed-node-0:3306 check port 3306 inter 2000 rise 2 fall 5', ' server testbed-node-1 testbed-node-1:3306 check port 3306 inter 2000 rise 2 fall 5 backup', ' server testbed-node-2 testbed-node-2:3306 check port 3306 inter 2000 rise 2 fall 5 backup', '']}}}}) 2025-09-27 00:53:56.642174 | 
orchestrator | 2025-09-27 00:53:56.642178 | orchestrator | TASK [mariadb : Create MariaDB volume] ***************************************** 2025-09-27 00:53:56.642182 | orchestrator | Saturday 27 September 2025 00:50:08 +0000 (0:00:02.822) 0:00:33.386 **** 2025-09-27 00:53:56.642187 | orchestrator | changed: [testbed-node-0] 2025-09-27 00:53:56.642191 | orchestrator | changed: [testbed-node-1] 2025-09-27 00:53:56.642195 | orchestrator | changed: [testbed-node-2] 2025-09-27 00:53:56.642199 | orchestrator | 2025-09-27 00:53:56.642203 | orchestrator | TASK [mariadb : Divide hosts by their MariaDB volume availability] ************* 2025-09-27 00:53:56.642207 | orchestrator | Saturday 27 September 2025 00:50:09 +0000 (0:00:00.766) 0:00:34.153 **** 2025-09-27 00:53:56.642211 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:53:56.642215 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:53:56.642219 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:53:56.642223 | orchestrator | 2025-09-27 00:53:56.642227 | orchestrator | TASK [mariadb : Establish whether the cluster has already existed] ************* 2025-09-27 00:53:56.642231 | orchestrator | Saturday 27 September 2025 00:50:10 +0000 (0:00:00.534) 0:00:34.687 **** 2025-09-27 00:53:56.642235 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:53:56.642240 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:53:56.642244 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:53:56.642248 | orchestrator | 2025-09-27 00:53:56.642252 | orchestrator | TASK [mariadb : Check MariaDB service port liveness] *************************** 2025-09-27 00:53:56.642256 | orchestrator | Saturday 27 September 2025 00:50:10 +0000 (0:00:00.412) 0:00:35.100 **** 2025-09-27 00:53:56.642260 | orchestrator | fatal: [testbed-node-0]: FAILED! => {"changed": false, "elapsed": 10, "msg": "Timeout when waiting for search string MariaDB in 192.168.16.10:3306"} 2025-09-27 00:53:56.642264 | orchestrator | ...ignoring 2025-09-27 00:53:56.642269 | orchestrator | fatal: [testbed-node-1]: FAILED! => {"changed": false, "elapsed": 10, "msg": "Timeout when waiting for search string MariaDB in 192.168.16.11:3306"} 2025-09-27 00:53:56.642273 | orchestrator | ...ignoring 2025-09-27 00:53:56.642279 | orchestrator | fatal: [testbed-node-2]: FAILED! 
=> {"changed": false, "elapsed": 10, "msg": "Timeout when waiting for search string MariaDB in 192.168.16.12:3306"} 2025-09-27 00:53:56.642287 | orchestrator | ...ignoring 2025-09-27 00:53:56.642291 | orchestrator | 2025-09-27 00:53:56.642295 | orchestrator | TASK [mariadb : Divide hosts by their MariaDB service port liveness] *********** 2025-09-27 00:53:56.642299 | orchestrator | Saturday 27 September 2025 00:50:21 +0000 (0:00:10.916) 0:00:46.016 **** 2025-09-27 00:53:56.642303 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:53:56.642307 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:53:56.642311 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:53:56.642315 | orchestrator | 2025-09-27 00:53:56.642320 | orchestrator | TASK [mariadb : Fail on existing but stopped cluster] ************************** 2025-09-27 00:53:56.642324 | orchestrator | Saturday 27 September 2025 00:50:21 +0000 (0:00:00.399) 0:00:46.416 **** 2025-09-27 00:53:56.642328 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:53:56.642332 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:53:56.642336 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:53:56.642340 | orchestrator | 2025-09-27 00:53:56.642344 | orchestrator | TASK [mariadb : Check MariaDB service WSREP sync status] *********************** 2025-09-27 00:53:56.642348 | orchestrator | Saturday 27 September 2025 00:50:22 +0000 (0:00:00.600) 0:00:47.016 **** 2025-09-27 00:53:56.642352 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:53:56.642356 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:53:56.642360 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:53:56.642365 | orchestrator | 2025-09-27 00:53:56.642369 | orchestrator | TASK [mariadb : Extract MariaDB service WSREP sync status] ********************* 2025-09-27 00:53:56.642373 | orchestrator | Saturday 27 September 2025 00:50:22 +0000 (0:00:00.463) 0:00:47.480 **** 2025-09-27 00:53:56.642377 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:53:56.642381 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:53:56.642385 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:53:56.642389 | orchestrator | 2025-09-27 00:53:56.642393 | orchestrator | TASK [mariadb : Divide hosts by their MariaDB service WSREP sync status] ******* 2025-09-27 00:53:56.642397 | orchestrator | Saturday 27 September 2025 00:50:23 +0000 (0:00:00.387) 0:00:47.868 **** 2025-09-27 00:53:56.642401 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:53:56.642405 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:53:56.642409 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:53:56.642414 | orchestrator | 2025-09-27 00:53:56.642418 | orchestrator | TASK [mariadb : Fail when MariaDB services are not synced across the whole cluster] *** 2025-09-27 00:53:56.642422 | orchestrator | Saturday 27 September 2025 00:50:23 +0000 (0:00:00.420) 0:00:48.288 **** 2025-09-27 00:53:56.642426 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:53:56.642430 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:53:56.642434 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:53:56.642438 | orchestrator | 2025-09-27 00:53:56.642442 | orchestrator | TASK [mariadb : include_tasks] ************************************************* 2025-09-27 00:53:56.642446 | orchestrator | Saturday 27 September 2025 00:50:24 +0000 (0:00:00.569) 0:00:48.857 **** 2025-09-27 00:53:56.642450 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:53:56.642454 | orchestrator | skipping: 
[testbed-node-2] 2025-09-27 00:53:56.642461 | orchestrator | included: /ansible/roles/mariadb/tasks/bootstrap_cluster.yml for testbed-node-0 2025-09-27 00:53:56.642465 | orchestrator | 2025-09-27 00:53:56.642469 | orchestrator | TASK [mariadb : Running MariaDB bootstrap container] *************************** 2025-09-27 00:53:56.642474 | orchestrator | Saturday 27 September 2025 00:50:24 +0000 (0:00:00.382) 0:00:49.240 **** 2025-09-27 00:53:56.642478 | orchestrator | changed: [testbed-node-0] 2025-09-27 00:53:56.642482 | orchestrator | 2025-09-27 00:53:56.642486 | orchestrator | TASK [mariadb : Store bootstrap host name into facts] ************************** 2025-09-27 00:53:56.642490 | orchestrator | Saturday 27 September 2025 00:50:35 +0000 (0:00:10.393) 0:00:59.634 **** 2025-09-27 00:53:56.642494 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:53:56.642498 | orchestrator | 2025-09-27 00:53:56.642502 | orchestrator | TASK [mariadb : include_tasks] ************************************************* 2025-09-27 00:53:56.642509 | orchestrator | Saturday 27 September 2025 00:50:35 +0000 (0:00:00.134) 0:00:59.768 **** 2025-09-27 00:53:56.642514 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:53:56.642518 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:53:56.642522 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:53:56.642526 | orchestrator | 2025-09-27 00:53:56.642530 | orchestrator | RUNNING HANDLER [mariadb : Starting first MariaDB container] ******************* 2025-09-27 00:53:56.642534 | orchestrator | Saturday 27 September 2025 00:50:36 +0000 (0:00:00.925) 0:01:00.694 **** 2025-09-27 00:53:56.642538 | orchestrator | changed: [testbed-node-0] 2025-09-27 00:53:56.642542 | orchestrator | 2025-09-27 00:53:56.642546 | orchestrator | RUNNING HANDLER [mariadb : Wait for first MariaDB service port liveness] ******* 2025-09-27 00:53:56.642550 | orchestrator | Saturday 27 September 2025 00:50:43 +0000 (0:00:07.042) 0:01:07.736 **** 2025-09-27 00:53:56.642554 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:53:56.642558 | orchestrator | 2025-09-27 00:53:56.642563 | orchestrator | RUNNING HANDLER [mariadb : Wait for first MariaDB service to sync WSREP] ******* 2025-09-27 00:53:56.642567 | orchestrator | Saturday 27 September 2025 00:50:44 +0000 (0:00:01.644) 0:01:09.381 **** 2025-09-27 00:53:56.642571 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:53:56.642575 | orchestrator | 2025-09-27 00:53:56.642579 | orchestrator | RUNNING HANDLER [mariadb : Ensure MariaDB is running normally on bootstrap host] *** 2025-09-27 00:53:56.642583 | orchestrator | Saturday 27 September 2025 00:50:47 +0000 (0:00:02.336) 0:01:11.717 **** 2025-09-27 00:53:56.642587 | orchestrator | changed: [testbed-node-0] 2025-09-27 00:53:56.642591 | orchestrator | 2025-09-27 00:53:56.642595 | orchestrator | RUNNING HANDLER [mariadb : Restart MariaDB on existing cluster members] ******** 2025-09-27 00:53:56.642599 | orchestrator | Saturday 27 September 2025 00:50:47 +0000 (0:00:00.142) 0:01:11.859 **** 2025-09-27 00:53:56.642604 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:53:56.642607 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:53:56.642611 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:53:56.642615 | orchestrator | 2025-09-27 00:53:56.642618 | orchestrator | RUNNING HANDLER [mariadb : Start MariaDB on new nodes] ************************* 2025-09-27 00:53:56.642622 | orchestrator | Saturday 27 September 2025 00:50:47 +0000 (0:00:00.311) 0:01:12.171 **** 
2025-09-27 00:53:56.642626 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:53:56.642630 | orchestrator | [WARNING]: Could not match supplied host pattern, ignoring: mariadb_restart 2025-09-27 00:53:56.642635 | orchestrator | changed: [testbed-node-1] 2025-09-27 00:53:56.642639 | orchestrator | changed: [testbed-node-2] 2025-09-27 00:53:56.642643 | orchestrator | 2025-09-27 00:53:56.642646 | orchestrator | PLAY [Restart mariadb services] ************************************************ 2025-09-27 00:53:56.642650 | orchestrator | skipping: no hosts matched 2025-09-27 00:53:56.642654 | orchestrator | 2025-09-27 00:53:56.642658 | orchestrator | PLAY [Start mariadb services] ************************************************** 2025-09-27 00:53:56.642661 | orchestrator | 2025-09-27 00:53:56.642665 | orchestrator | TASK [mariadb : Restart MariaDB container] ************************************* 2025-09-27 00:53:56.642669 | orchestrator | Saturday 27 September 2025 00:50:48 +0000 (0:00:00.508) 0:01:12.679 **** 2025-09-27 00:53:56.642673 | orchestrator | changed: [testbed-node-1] 2025-09-27 00:53:56.642676 | orchestrator | 2025-09-27 00:53:56.642680 | orchestrator | TASK [mariadb : Wait for MariaDB service port liveness] ************************ 2025-09-27 00:53:56.642684 | orchestrator | Saturday 27 September 2025 00:51:06 +0000 (0:00:18.190) 0:01:30.870 **** 2025-09-27 00:53:56.642687 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:53:56.642691 | orchestrator | 2025-09-27 00:53:56.642695 | orchestrator | TASK [mariadb : Wait for MariaDB service to sync WSREP] ************************ 2025-09-27 00:53:56.642699 | orchestrator | Saturday 27 September 2025 00:51:27 +0000 (0:00:21.517) 0:01:52.387 **** 2025-09-27 00:53:56.642702 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:53:56.642706 | orchestrator | 2025-09-27 00:53:56.642710 | orchestrator | PLAY [Start mariadb services] ************************************************** 2025-09-27 00:53:56.642713 | orchestrator | 2025-09-27 00:53:56.642717 | orchestrator | TASK [mariadb : Restart MariaDB container] ************************************* 2025-09-27 00:53:56.642724 | orchestrator | Saturday 27 September 2025 00:51:30 +0000 (0:00:02.314) 0:01:54.702 **** 2025-09-27 00:53:56.642727 | orchestrator | changed: [testbed-node-2] 2025-09-27 00:53:56.642731 | orchestrator | 2025-09-27 00:53:56.642735 | orchestrator | TASK [mariadb : Wait for MariaDB service port liveness] ************************ 2025-09-27 00:53:56.642739 | orchestrator | Saturday 27 September 2025 00:51:48 +0000 (0:00:18.654) 0:02:13.356 **** 2025-09-27 00:53:56.642742 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:53:56.642746 | orchestrator | 2025-09-27 00:53:56.642750 | orchestrator | TASK [mariadb : Wait for MariaDB service to sync WSREP] ************************ 2025-09-27 00:53:56.642753 | orchestrator | Saturday 27 September 2025 00:52:09 +0000 (0:00:20.591) 0:02:33.948 **** 2025-09-27 00:53:56.642757 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:53:56.642761 | orchestrator | 2025-09-27 00:53:56.642765 | orchestrator | PLAY [Restart bootstrap mariadb service] *************************************** 2025-09-27 00:53:56.642768 | orchestrator | 2025-09-27 00:53:56.642772 | orchestrator | TASK [mariadb : Restart MariaDB container] ************************************* 2025-09-27 00:53:56.642776 | orchestrator | Saturday 27 September 2025 00:52:11 +0000 (0:00:02.440) 0:02:36.389 **** 2025-09-27 00:53:56.642779 | orchestrator | changed: [testbed-node-0] 
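The sequence above bootstraps the Galera cluster on testbed-node-0 and then brings testbed-node-1 and testbed-node-2 in one after the other, each time waiting for the MariaDB port to answer and for WSREP to report the node as synced. The following is a minimal illustrative Ansible sketch of that kind of per-node verification, not the kolla-ansible implementation itself; the host address, container name, the 'monitor' user (taken from the container environment logged above) and the mariadb_monitor_password variable are assumptions made for the example.

- name: Verify MariaDB Galera sync state on one node (illustrative sketch)
  hosts: testbed-node-0
  gather_facts: false
  tasks:
    - name: Wait until the MariaDB port answers with the server banner
      ansible.builtin.wait_for:
        host: 192.168.16.10
        port: 3306
        search_regex: MariaDB
        timeout: 60

    - name: Read wsrep_local_state_comment from inside the mariadb container
      # Assumes the 'monitor' user seen in the container environment above;
      # the password variable here is a placeholder, not a kolla-ansible variable.
      ansible.builtin.command: >-
        docker exec mariadb mysql -u monitor -p{{ mariadb_monitor_password }}
        -e "SHOW STATUS LIKE 'wsrep_local_state_comment'"
      register: wsrep_state
      changed_when: false
      no_log: true

    - name: Fail unless the node reports Synced
      ansible.builtin.assert:
        that: "'Synced' in wsrep_state.stdout"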
2025-09-27 00:53:56.642783 | orchestrator | 2025-09-27 00:53:56.642787 | orchestrator | TASK [mariadb : Wait for MariaDB service port liveness] ************************ 2025-09-27 00:53:56.642791 | orchestrator | Saturday 27 September 2025 00:52:28 +0000 (0:00:16.644) 0:02:53.033 **** 2025-09-27 00:53:56.642794 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:53:56.642798 | orchestrator | 2025-09-27 00:53:56.642804 | orchestrator | TASK [mariadb : Wait for MariaDB service to sync WSREP] ************************ 2025-09-27 00:53:56.642808 | orchestrator | Saturday 27 September 2025 00:52:29 +0000 (0:00:00.551) 0:02:53.585 **** 2025-09-27 00:53:56.642812 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:53:56.642816 | orchestrator | 2025-09-27 00:53:56.642819 | orchestrator | PLAY [Apply mariadb post-configuration] **************************************** 2025-09-27 00:53:56.642823 | orchestrator | 2025-09-27 00:53:56.642827 | orchestrator | TASK [Include mariadb post-deploy.yml] ***************************************** 2025-09-27 00:53:56.642830 | orchestrator | Saturday 27 September 2025 00:52:31 +0000 (0:00:02.581) 0:02:56.167 **** 2025-09-27 00:53:56.642834 | orchestrator | included: mariadb for testbed-node-0, testbed-node-1, testbed-node-2 2025-09-27 00:53:56.642838 | orchestrator | 2025-09-27 00:53:56.642842 | orchestrator | TASK [mariadb : Creating shard root mysql user] ******************************** 2025-09-27 00:53:56.642845 | orchestrator | Saturday 27 September 2025 00:52:32 +0000 (0:00:00.539) 0:02:56.706 **** 2025-09-27 00:53:56.642849 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:53:56.642853 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:53:56.642856 | orchestrator | fatal: [testbed-node-0]: FAILED! => {"changed": false, "msg": "kolla_toolbox container is not running."} 2025-09-27 00:53:56.642860 | orchestrator | 2025-09-27 00:53:56.642864 | orchestrator | TASK [mariadb : Creating mysql monitor user] *********************************** 2025-09-27 00:53:56.642868 | orchestrator | Saturday 27 September 2025 00:52:33 +0000 (0:00:00.860) 0:02:57.567 **** 2025-09-27 00:53:56.642871 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:53:56.642875 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:53:56.642879 | orchestrator | 2025-09-27 00:53:56.642882 | orchestrator | TASK [mariadb : Creating database backup user and setting permissions] ********* 2025-09-27 00:53:56.642886 | orchestrator | Saturday 27 September 2025 00:52:33 +0000 (0:00:00.213) 0:02:57.781 **** 2025-09-27 00:53:56.642890 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:53:56.642894 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:53:56.642897 | orchestrator | 2025-09-27 00:53:56.642901 | orchestrator | TASK [mariadb : Granting permissions on Mariabackup database to backup user] *** 2025-09-27 00:53:56.642905 | orchestrator | Saturday 27 September 2025 00:52:33 +0000 (0:00:00.357) 0:02:58.139 **** 2025-09-27 00:53:56.642909 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:53:56.642912 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:53:56.642919 | orchestrator | 2025-09-27 00:53:56.642923 | orchestrator | TASK [mariadb : Wait for MariaDB service to be ready through VIP] ************** 2025-09-27 00:53:56.642927 | orchestrator | Saturday 27 September 2025 00:52:33 +0000 (0:00:00.224) 0:02:58.363 **** 2025-09-27 00:53:56.642931 | orchestrator | FAILED - RETRYING: [testbed-node-2]: Wait for MariaDB service to be ready through VIP (6 
retries left). 2025-09-27 00:53:56.642935 | orchestrator | FAILED - RETRYING: [testbed-node-1]: Wait for MariaDB service to be ready through VIP (6 retries left). 2025-09-27 00:53:56.642940 | orchestrator | FAILED - RETRYING: [testbed-node-2]: Wait for MariaDB service to be ready through VIP (5 retries left). 2025-09-27 00:53:56.642944 | orchestrator | FAILED - RETRYING: [testbed-node-1]: Wait for MariaDB service to be ready through VIP (5 retries left). 2025-09-27 00:53:56.642948 | orchestrator | FAILED - RETRYING: [testbed-node-2]: Wait for MariaDB service to be ready through VIP (4 retries left). 2025-09-27 00:53:56.642951 | orchestrator | FAILED - RETRYING: [testbed-node-1]: Wait for MariaDB service to be ready through VIP (4 retries left). 2025-09-27 00:53:56.642955 | orchestrator | FAILED - RETRYING: [testbed-node-2]: Wait for MariaDB service to be ready through VIP (3 retries left). 2025-09-27 00:53:56.642959 | orchestrator | FAILED - RETRYING: [testbed-node-1]: Wait for MariaDB service to be ready through VIP (3 retries left). 2025-09-27 00:53:56.642963 | orchestrator | FAILED - RETRYING: [testbed-node-2]: Wait for MariaDB service to be ready through VIP (2 retries left). 2025-09-27 00:53:56.642966 | orchestrator | FAILED - RETRYING: [testbed-node-1]: Wait for MariaDB service to be ready through VIP (2 retries left). 2025-09-27 00:53:56.642970 | orchestrator | FAILED - RETRYING: [testbed-node-2]: Wait for MariaDB service to be ready through VIP (1 retries left). 2025-09-27 00:53:56.642974 | orchestrator | FAILED - RETRYING: [testbed-node-1]: Wait for MariaDB service to be ready through VIP (1 retries left). 2025-09-27 00:53:56.642978 | orchestrator | fatal: [testbed-node-2]: FAILED! => {"attempts": 6, "changed": false, "cmd": ["docker", "exec", "mariadb", "mysql", "-h", "api-int.testbed.osism.xyz", "-P", "3306", "-u", "root_shard_0", "-ppassword", "-e", "show databases;"], "delta": "0:00:03.246074", "end": "2025-09-27 00:53:52.167631", "msg": "non-zero return code", "rc": 1, "start": "2025-09-27 00:53:48.921557", "stderr": "ERROR 2002 (HY000): Can't connect to server on 'api-int.testbed.osism.xyz' (115)", "stderr_lines": ["ERROR 2002 (HY000): Can't connect to server on 'api-int.testbed.osism.xyz' (115)"], "stdout": "", "stdout_lines": []} 2025-09-27 00:53:56.642983 | orchestrator | fatal: [testbed-node-1]: FAILED! 
=> {"attempts": 6, "changed": false, "cmd": ["docker", "exec", "mariadb", "mysql", "-h", "api-int.testbed.osism.xyz", "-P", "3306", "-u", "root_shard_0", "-ppassword", "-e", "show databases;"], "delta": "0:00:02.140654", "end": "2025-09-27 00:53:55.960086", "msg": "non-zero return code", "rc": 1, "start": "2025-09-27 00:53:53.819432", "stderr": "ERROR 2002 (HY000): Can't connect to server on 'api-int.testbed.osism.xyz' (115)", "stderr_lines": ["ERROR 2002 (HY000): Can't connect to server on 'api-int.testbed.osism.xyz' (115)"], "stdout": "", "stdout_lines": []} 2025-09-27 00:53:56.642987 | orchestrator | 2025-09-27 00:53:56.642991 | orchestrator | PLAY RECAP ********************************************************************* 2025-09-27 00:53:56.642995 | orchestrator | localhost : ok=3  changed=0 unreachable=0 failed=0 skipped=1  rescued=0 ignored=1  2025-09-27 00:53:56.642999 | orchestrator | testbed-node-0 : ok=29  changed=12  unreachable=0 failed=1  skipped=10  rescued=0 ignored=1  2025-09-27 00:53:56.643004 | orchestrator | testbed-node-1 : ok=19  changed=7  unreachable=0 failed=1  skipped=17  rescued=0 ignored=1  2025-09-27 00:53:56.643007 | orchestrator | testbed-node-2 : ok=19  changed=7  unreachable=0 failed=1  skipped=17  rescued=0 ignored=1  2025-09-27 00:53:56.643014 | orchestrator | 2025-09-27 00:53:56.643018 | orchestrator | 2025-09-27 00:53:56.643022 | orchestrator | TASKS RECAP ******************************************************************** 2025-09-27 00:53:56.643026 | orchestrator | Saturday 27 September 2025 00:53:56 +0000 (0:01:22.164) 0:04:20.528 **** 2025-09-27 00:53:56.643030 | orchestrator | =============================================================================== 2025-09-27 00:53:56.643060 | orchestrator | mariadb : Wait for MariaDB service to be ready through VIP ------------- 82.16s 2025-09-27 00:53:56.643068 | orchestrator | mariadb : Wait for MariaDB service port liveness ----------------------- 42.11s 2025-09-27 00:53:56.643072 | orchestrator | mariadb : Restart MariaDB container ------------------------------------ 36.85s 2025-09-27 00:53:56.643076 | orchestrator | mariadb : Restart MariaDB container ------------------------------------ 16.64s 2025-09-27 00:53:56.643080 | orchestrator | mariadb : Check MariaDB service port liveness -------------------------- 10.92s 2025-09-27 00:53:56.643083 | orchestrator | mariadb : Running MariaDB bootstrap container -------------------------- 10.39s 2025-09-27 00:53:56.643097 | orchestrator | mariadb : Starting first MariaDB container ------------------------------ 7.04s 2025-09-27 00:53:56.643100 | orchestrator | mariadb : Wait for MariaDB service to sync WSREP ------------------------ 4.76s 2025-09-27 00:53:56.643104 | orchestrator | mariadb : Copying over galera.cnf --------------------------------------- 3.95s 2025-09-27 00:53:56.643110 | orchestrator | service-cert-copy : mariadb | Copying over backend internal TLS key ----- 3.93s 2025-09-27 00:53:56.643114 | orchestrator | mariadb : Ensuring config directories exist ----------------------------- 3.65s 2025-09-27 00:53:56.643118 | orchestrator | service-cert-copy : mariadb | Copying over extra CA certificates -------- 3.38s 2025-09-27 00:53:56.643122 | orchestrator | mariadb : Copying over config.json files for services ------------------- 3.31s 2025-09-27 00:53:56.643126 | orchestrator | service-cert-copy : mariadb | Copying over backend internal TLS certificate --- 3.20s 2025-09-27 00:53:56.643129 | orchestrator | Check MariaDB service 
--------------------------------------------------- 2.83s 2025-09-27 00:53:56.643133 | orchestrator | mariadb : Check mariadb containers -------------------------------------- 2.82s 2025-09-27 00:53:56.643137 | orchestrator | mariadb : Wait for MariaDB service to sync WSREP ------------------------ 2.58s 2025-09-27 00:53:56.643141 | orchestrator | mariadb : Wait for first MariaDB service to sync WSREP ------------------ 2.34s 2025-09-27 00:53:56.643144 | orchestrator | mariadb : Wait for first MariaDB service port liveness ------------------ 1.64s 2025-09-27 00:53:56.643148 | orchestrator | mariadb : Copying over config.json files for mariabackup ---------------- 1.42s 2025-09-27 00:53:56.643152 | orchestrator | 2025-09-27 00:53:56 | INFO  | Task 4fa01eb7-5be3-437c-b4e5-8b90d9b0cd67 is in state SUCCESS 2025-09-27 00:53:56.643156 | orchestrator | 2025-09-27 00:53:56 | INFO  | Wait 1 second(s) until refresh of running tasks 2025-09-27 00:53:59.664949 | orchestrator | 2025-09-27 00:53:59 | INFO  | Task dadd21b2-134e-459b-ba90-b4d2ad4ed941 is in state STARTED 2025-09-27 00:53:59.665673 | orchestrator | 2025-09-27 00:53:59 | INFO  | Task cd555ed5-77dc-4122-95c3-86725144bacd is in state STARTED 2025-09-27 00:53:59.666703 | orchestrator | 2025-09-27 00:53:59 | INFO  | Task 31d12d1f-fed8-4943-b1b7-173bc3b5ddd4 is in state STARTED 2025-09-27 00:53:59.666730 | orchestrator | 2025-09-27 00:53:59 | INFO  | Wait 1 second(s) until the next check 2025-09-27 00:54:02.698676 | orchestrator | 2025-09-27 00:54:02 | INFO  | Task dadd21b2-134e-459b-ba90-b4d2ad4ed941 is in state STARTED 2025-09-27 00:54:02.700044 | orchestrator | 2025-09-27 00:54:02 | INFO  | Task cd555ed5-77dc-4122-95c3-86725144bacd is in state STARTED 2025-09-27 00:54:02.703031 | orchestrator | 2025-09-27 00:54:02 | INFO  | Task 31d12d1f-fed8-4943-b1b7-173bc3b5ddd4 is in state STARTED 2025-09-27 00:54:02.703063 | orchestrator | 2025-09-27 00:54:02 | INFO  | Wait 1 second(s) until the next check 2025-09-27 00:54:05.738803 | orchestrator | 2025-09-27 00:54:05 | INFO  | Task dadd21b2-134e-459b-ba90-b4d2ad4ed941 is in state STARTED 2025-09-27 00:54:05.739064 | orchestrator | 2025-09-27 00:54:05 | INFO  | Task cd555ed5-77dc-4122-95c3-86725144bacd is in state STARTED 2025-09-27 00:54:05.739911 | orchestrator | 2025-09-27 00:54:05 | INFO  | Task 31d12d1f-fed8-4943-b1b7-173bc3b5ddd4 is in state STARTED 2025-09-27 00:54:05.739942 | orchestrator | 2025-09-27 00:54:05 | INFO  | Wait 1 second(s) until the next check 2025-09-27 00:54:08.778362 | orchestrator | 2025-09-27 00:54:08 | INFO  | Task dadd21b2-134e-459b-ba90-b4d2ad4ed941 is in state STARTED 2025-09-27 00:54:08.778720 | orchestrator | 2025-09-27 00:54:08 | INFO  | Task cd555ed5-77dc-4122-95c3-86725144bacd is in state STARTED 2025-09-27 00:54:08.780568 | orchestrator | 2025-09-27 00:54:08 | INFO  | Task 31d12d1f-fed8-4943-b1b7-173bc3b5ddd4 is in state STARTED 2025-09-27 00:54:08.780597 | orchestrator | 2025-09-27 00:54:08 | INFO  | Wait 1 second(s) until the next check 2025-09-27 00:54:11.811452 | orchestrator | 2025-09-27 00:54:11 | INFO  | Task dadd21b2-134e-459b-ba90-b4d2ad4ed941 is in state STARTED 2025-09-27 00:54:11.813719 | orchestrator | 2025-09-27 00:54:11 | INFO  | Task cd555ed5-77dc-4122-95c3-86725144bacd is in state STARTED 2025-09-27 00:54:11.814493 | orchestrator | 2025-09-27 00:54:11 | INFO  | Task 31d12d1f-fed8-4943-b1b7-173bc3b5ddd4 is in state STARTED 2025-09-27 00:54:11.816927 | orchestrator | 2025-09-27 00:54:11 | INFO  | Wait 1 second(s) until the next check 2025-09-27 
00:54:14.854255 | orchestrator | 2025-09-27 00:54:14 | INFO  | Task dadd21b2-134e-459b-ba90-b4d2ad4ed941 is in state STARTED 2025-09-27 00:54:14.854776 | orchestrator | 2025-09-27 00:54:14 | INFO  | Task cd555ed5-77dc-4122-95c3-86725144bacd is in state STARTED 2025-09-27 00:54:14.855530 | orchestrator | 2025-09-27 00:54:14 | INFO  | Task 31d12d1f-fed8-4943-b1b7-173bc3b5ddd4 is in state STARTED 2025-09-27 00:54:14.855554 | orchestrator | 2025-09-27 00:54:14 | INFO  | Wait 1 second(s) until the next check 2025-09-27 00:54:17.892601 | orchestrator | 2025-09-27 00:54:17 | INFO  | Task dadd21b2-134e-459b-ba90-b4d2ad4ed941 is in state STARTED 2025-09-27 00:54:17.893383 | orchestrator | 2025-09-27 00:54:17 | INFO  | Task cd555ed5-77dc-4122-95c3-86725144bacd is in state STARTED 2025-09-27 00:54:17.895293 | orchestrator | 2025-09-27 00:54:17 | INFO  | Task 31d12d1f-fed8-4943-b1b7-173bc3b5ddd4 is in state STARTED 2025-09-27 00:54:17.895319 | orchestrator | 2025-09-27 00:54:17 | INFO  | Wait 1 second(s) until the next check 2025-09-27 00:54:20.947015 | orchestrator | 2025-09-27 00:54:20 | INFO  | Task dadd21b2-134e-459b-ba90-b4d2ad4ed941 is in state STARTED 2025-09-27 00:54:20.947139 | orchestrator | 2025-09-27 00:54:20 | INFO  | Task cd555ed5-77dc-4122-95c3-86725144bacd is in state STARTED 2025-09-27 00:54:20.947155 | orchestrator | 2025-09-27 00:54:20 | INFO  | Task 31d12d1f-fed8-4943-b1b7-173bc3b5ddd4 is in state STARTED 2025-09-27 00:54:20.947167 | orchestrator | 2025-09-27 00:54:20 | INFO  | Wait 1 second(s) until the next check 2025-09-27 00:54:23.960974 | orchestrator | 2025-09-27 00:54:23 | INFO  | Task dadd21b2-134e-459b-ba90-b4d2ad4ed941 is in state STARTED 2025-09-27 00:54:23.961320 | orchestrator | 2025-09-27 00:54:23 | INFO  | Task cd555ed5-77dc-4122-95c3-86725144bacd is in state STARTED 2025-09-27 00:54:23.963234 | orchestrator | 2025-09-27 00:54:23 | INFO  | Task 31d12d1f-fed8-4943-b1b7-173bc3b5ddd4 is in state STARTED 2025-09-27 00:54:23.963266 | orchestrator | 2025-09-27 00:54:23 | INFO  | Wait 1 second(s) until the next check 2025-09-27 00:54:27.014537 | orchestrator | 2025-09-27 00:54:27 | INFO  | Task dadd21b2-134e-459b-ba90-b4d2ad4ed941 is in state STARTED 2025-09-27 00:54:27.017499 | orchestrator | 2025-09-27 00:54:27 | INFO  | Task cd555ed5-77dc-4122-95c3-86725144bacd is in state STARTED 2025-09-27 00:54:27.018460 | orchestrator | 2025-09-27 00:54:27 | INFO  | Task 31d12d1f-fed8-4943-b1b7-173bc3b5ddd4 is in state STARTED 2025-09-27 00:54:27.018690 | orchestrator | 2025-09-27 00:54:27 | INFO  | Wait 1 second(s) until the next check 2025-09-27 00:54:30.054382 | orchestrator | 2025-09-27 00:54:30 | INFO  | Task dadd21b2-134e-459b-ba90-b4d2ad4ed941 is in state SUCCESS 2025-09-27 00:54:30.055301 | orchestrator | 2025-09-27 00:54:30.055426 | orchestrator | 2025-09-27 00:54:30.055669 | orchestrator | PLAY [Group hosts based on configuration] ************************************** 2025-09-27 00:54:30.055685 | orchestrator | 2025-09-27 00:54:30.055696 | orchestrator | TASK [Group hosts based on Kolla action] *************************************** 2025-09-27 00:54:30.055707 | orchestrator | Saturday 27 September 2025 00:54:00 +0000 (0:00:00.237) 0:00:00.237 **** 2025-09-27 00:54:30.055719 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:54:30.055730 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:54:30.055741 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:54:30.055752 | orchestrator | 2025-09-27 00:54:30.055780 | orchestrator | TASK [Group hosts based on enabled services] 
*********************************** 2025-09-27 00:54:30.055792 | orchestrator | Saturday 27 September 2025 00:54:00 +0000 (0:00:00.266) 0:00:00.504 **** 2025-09-27 00:54:30.055803 | orchestrator | ok: [testbed-node-0] => (item=enable_horizon_True) 2025-09-27 00:54:30.055814 | orchestrator | ok: [testbed-node-1] => (item=enable_horizon_True) 2025-09-27 00:54:30.055825 | orchestrator | ok: [testbed-node-2] => (item=enable_horizon_True) 2025-09-27 00:54:30.055836 | orchestrator | 2025-09-27 00:54:30.055846 | orchestrator | PLAY [Apply role horizon] ****************************************************** 2025-09-27 00:54:30.055857 | orchestrator | 2025-09-27 00:54:30.055868 | orchestrator | TASK [horizon : include_tasks] ************************************************* 2025-09-27 00:54:30.055879 | orchestrator | Saturday 27 September 2025 00:54:00 +0000 (0:00:00.355) 0:00:00.860 **** 2025-09-27 00:54:30.055890 | orchestrator | included: /ansible/roles/horizon/tasks/deploy.yml for testbed-node-0, testbed-node-1, testbed-node-2 2025-09-27 00:54:30.055901 | orchestrator | 2025-09-27 00:54:30.055912 | orchestrator | TASK [horizon : Ensuring config directories exist] ***************************** 2025-09-27 00:54:30.055923 | orchestrator | Saturday 27 September 2025 00:54:01 +0000 (0:00:00.418) 0:00:01.278 **** 2025-09-27 00:54:30.055940 | orchestrator | changed: [testbed-node-0] => (item={'key': 'horizon', 'value': {'container_name': 'horizon', 'group': 'horizon', 'enabled': True, 'image': 'registry.osism.tech/kolla/horizon:2024.2', 'environment': {'ENABLE_BLAZAR': 'no', 'ENABLE_CLOUDKITTY': 'no', 'ENABLE_DESIGNATE': 'yes', 'ENABLE_FWAAS': 'no', 'ENABLE_HEAT': 'no', 'ENABLE_IRONIC': 'no', 'ENABLE_MAGNUM': 'yes', 'ENABLE_MANILA': 'yes', 'ENABLE_MASAKARI': 'no', 'ENABLE_MISTRAL': 'no', 'ENABLE_NEUTRON_VPNAAS': 'no', 'ENABLE_OCTAVIA': 'yes', 'ENABLE_TACKER': 'no', 'ENABLE_TROVE': 'no', 'ENABLE_WATCHER': 'no', 'ENABLE_ZUN': 'no', 'FORCE_GENERATE': 'no'}, 'volumes': ['/etc/kolla/horizon/:/var/lib/kolla/config_files/:ro', '', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:80'], 'timeout': '30'}, 'haproxy': {'horizon': {'enabled': True, 'mode': 'http', 'external': False, 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin'], 'tls_backend': 'no'}, 'horizon_redirect': {'enabled': True, 'mode': 'redirect', 'external': False, 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'horizon_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin'], 'tls_backend': 'no'}, 'horizon_external_redirect': {'enabled': True, 'mode': 'redirect', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'acme_client': {'enabled': True, 'with_frontend': False, 'custom_member_list': []}}}}) 2025-09-27 00:54:30.056004 | orchestrator 
| changed: [testbed-node-1] => (item={'key': 'horizon', 'value': {'container_name': 'horizon', 'group': 'horizon', 'enabled': True, 'image': 'registry.osism.tech/kolla/horizon:2024.2', 'environment': {'ENABLE_BLAZAR': 'no', 'ENABLE_CLOUDKITTY': 'no', 'ENABLE_DESIGNATE': 'yes', 'ENABLE_FWAAS': 'no', 'ENABLE_HEAT': 'no', 'ENABLE_IRONIC': 'no', 'ENABLE_MAGNUM': 'yes', 'ENABLE_MANILA': 'yes', 'ENABLE_MASAKARI': 'no', 'ENABLE_MISTRAL': 'no', 'ENABLE_NEUTRON_VPNAAS': 'no', 'ENABLE_OCTAVIA': 'yes', 'ENABLE_TACKER': 'no', 'ENABLE_TROVE': 'no', 'ENABLE_WATCHER': 'no', 'ENABLE_ZUN': 'no', 'FORCE_GENERATE': 'no'}, 'volumes': ['/etc/kolla/horizon/:/var/lib/kolla/config_files/:ro', '', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:80'], 'timeout': '30'}, 'haproxy': {'horizon': {'enabled': True, 'mode': 'http', 'external': False, 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin'], 'tls_backend': 'no'}, 'horizon_redirect': {'enabled': True, 'mode': 'redirect', 'external': False, 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'horizon_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin'], 'tls_backend': 'no'}, 'horizon_external_redirect': {'enabled': True, 'mode': 'redirect', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'acme_client': {'enabled': True, 'with_frontend': False, 'custom_member_list': []}}}}) 2025-09-27 00:54:30.056020 | orchestrator | changed: [testbed-node-2] => (item={'key': 'horizon', 'value': {'container_name': 'horizon', 'group': 'horizon', 'enabled': True, 'image': 'registry.osism.tech/kolla/horizon:2024.2', 'environment': {'ENABLE_BLAZAR': 'no', 'ENABLE_CLOUDKITTY': 'no', 'ENABLE_DESIGNATE': 'yes', 'ENABLE_FWAAS': 'no', 'ENABLE_HEAT': 'no', 'ENABLE_IRONIC': 'no', 'ENABLE_MAGNUM': 'yes', 'ENABLE_MANILA': 'yes', 'ENABLE_MASAKARI': 'no', 'ENABLE_MISTRAL': 'no', 'ENABLE_NEUTRON_VPNAAS': 'no', 'ENABLE_OCTAVIA': 'yes', 'ENABLE_TACKER': 'no', 'ENABLE_TROVE': 'no', 'ENABLE_WATCHER': 'no', 'ENABLE_ZUN': 'no', 'FORCE_GENERATE': 'no'}, 'volumes': ['/etc/kolla/horizon/:/var/lib/kolla/config_files/:ro', '', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:80'], 'timeout': '30'}, 'haproxy': {'horizon': {'enabled': True, 'mode': 'http', 'external': False, 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin'], 'tls_backend': 'no'}, 'horizon_redirect': {'enabled': True, 'mode': 'redirect', 'external': False, 'port': '80', 'listen_port': '80', 
'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'horizon_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin'], 'tls_backend': 'no'}, 'horizon_external_redirect': {'enabled': True, 'mode': 'redirect', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'acme_client': {'enabled': True, 'with_frontend': False, 'custom_member_list': []}}}}) 2025-09-27 00:54:30.056040 | orchestrator | 2025-09-27 00:54:30.056051 | orchestrator | TASK [horizon : Set empty custom policy] *************************************** 2025-09-27 00:54:30.056062 | orchestrator | Saturday 27 September 2025 00:54:02 +0000 (0:00:01.163) 0:00:02.441 **** 2025-09-27 00:54:30.056073 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:54:30.056218 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:54:30.056230 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:54:30.056241 | orchestrator | 2025-09-27 00:54:30.056251 | orchestrator | TASK [horizon : include_tasks] ************************************************* 2025-09-27 00:54:30.056262 | orchestrator | Saturday 27 September 2025 00:54:02 +0000 (0:00:00.344) 0:00:02.785 **** 2025-09-27 00:54:30.056273 | orchestrator | skipping: [testbed-node-0] => (item={'name': 'cloudkitty', 'enabled': False})  2025-09-27 00:54:30.056295 | orchestrator | skipping: [testbed-node-0] => (item={'name': 'heat', 'enabled': 'no'})  2025-09-27 00:54:30.056307 | orchestrator | skipping: [testbed-node-0] => (item={'name': 'ironic', 'enabled': False})  2025-09-27 00:54:30.056318 | orchestrator | skipping: [testbed-node-0] => (item={'name': 'masakari', 'enabled': False})  2025-09-27 00:54:30.056328 | orchestrator | skipping: [testbed-node-0] => (item={'name': 'mistral', 'enabled': False})  2025-09-27 00:54:30.056345 | orchestrator | skipping: [testbed-node-0] => (item={'name': 'tacker', 'enabled': False})  2025-09-27 00:54:30.056357 | orchestrator | skipping: [testbed-node-0] => (item={'name': 'trove', 'enabled': False})  2025-09-27 00:54:30.056367 | orchestrator | skipping: [testbed-node-0] => (item={'name': 'watcher', 'enabled': False})  2025-09-27 00:54:30.056378 | orchestrator | skipping: [testbed-node-1] => (item={'name': 'cloudkitty', 'enabled': False})  2025-09-27 00:54:30.056389 | orchestrator | skipping: [testbed-node-1] => (item={'name': 'heat', 'enabled': 'no'})  2025-09-27 00:54:30.056399 | orchestrator | skipping: [testbed-node-1] => (item={'name': 'ironic', 'enabled': False})  2025-09-27 00:54:30.056410 | orchestrator | skipping: [testbed-node-1] => (item={'name': 'masakari', 'enabled': False})  2025-09-27 00:54:30.056420 | orchestrator | skipping: [testbed-node-1] => (item={'name': 'mistral', 'enabled': False})  2025-09-27 00:54:30.056431 | orchestrator | skipping: [testbed-node-1] => (item={'name': 'tacker', 'enabled': False})  2025-09-27 00:54:30.056441 | orchestrator | skipping: [testbed-node-1] => (item={'name': 'trove', 'enabled': False})  2025-09-27 00:54:30.056452 | orchestrator | skipping: [testbed-node-1] => (item={'name': 'watcher', 'enabled': False})  2025-09-27 00:54:30.056463 | orchestrator | skipping: [testbed-node-2] => 
(item={'name': 'cloudkitty', 'enabled': False})  2025-09-27 00:54:30.056474 | orchestrator | skipping: [testbed-node-2] => (item={'name': 'heat', 'enabled': 'no'})  2025-09-27 00:54:30.056484 | orchestrator | skipping: [testbed-node-2] => (item={'name': 'ironic', 'enabled': False})  2025-09-27 00:54:30.056495 | orchestrator | skipping: [testbed-node-2] => (item={'name': 'masakari', 'enabled': False})  2025-09-27 00:54:30.056505 | orchestrator | skipping: [testbed-node-2] => (item={'name': 'mistral', 'enabled': False})  2025-09-27 00:54:30.056516 | orchestrator | skipping: [testbed-node-2] => (item={'name': 'tacker', 'enabled': False})  2025-09-27 00:54:30.056535 | orchestrator | skipping: [testbed-node-2] => (item={'name': 'trove', 'enabled': False})  2025-09-27 00:54:30.056546 | orchestrator | skipping: [testbed-node-2] => (item={'name': 'watcher', 'enabled': False})  2025-09-27 00:54:30.056557 | orchestrator | included: /ansible/roles/horizon/tasks/policy_item.yml for testbed-node-0, testbed-node-1, testbed-node-2 => (item={'name': 'ceilometer', 'enabled': 'yes'}) 2025-09-27 00:54:30.056570 | orchestrator | included: /ansible/roles/horizon/tasks/policy_item.yml for testbed-node-0, testbed-node-1, testbed-node-2 => (item={'name': 'cinder', 'enabled': 'yes'}) 2025-09-27 00:54:30.056581 | orchestrator | included: /ansible/roles/horizon/tasks/policy_item.yml for testbed-node-0, testbed-node-1, testbed-node-2 => (item={'name': 'designate', 'enabled': True}) 2025-09-27 00:54:30.056591 | orchestrator | included: /ansible/roles/horizon/tasks/policy_item.yml for testbed-node-0, testbed-node-1, testbed-node-2 => (item={'name': 'glance', 'enabled': True}) 2025-09-27 00:54:30.056602 | orchestrator | included: /ansible/roles/horizon/tasks/policy_item.yml for testbed-node-0, testbed-node-1, testbed-node-2 => (item={'name': 'keystone', 'enabled': True}) 2025-09-27 00:54:30.056613 | orchestrator | included: /ansible/roles/horizon/tasks/policy_item.yml for testbed-node-0, testbed-node-1, testbed-node-2 => (item={'name': 'magnum', 'enabled': True}) 2025-09-27 00:54:30.056623 | orchestrator | included: /ansible/roles/horizon/tasks/policy_item.yml for testbed-node-0, testbed-node-1, testbed-node-2 => (item={'name': 'manila', 'enabled': True}) 2025-09-27 00:54:30.056634 | orchestrator | included: /ansible/roles/horizon/tasks/policy_item.yml for testbed-node-0, testbed-node-1, testbed-node-2 => (item={'name': 'neutron', 'enabled': True}) 2025-09-27 00:54:30.056645 | orchestrator | included: /ansible/roles/horizon/tasks/policy_item.yml for testbed-node-0, testbed-node-1, testbed-node-2 => (item={'name': 'nova', 'enabled': True}) 2025-09-27 00:54:30.056656 | orchestrator | included: /ansible/roles/horizon/tasks/policy_item.yml for testbed-node-0, testbed-node-1, testbed-node-2 => (item={'name': 'octavia', 'enabled': True}) 2025-09-27 00:54:30.056667 | orchestrator | 2025-09-27 00:54:30.056678 | orchestrator | TASK [horizon : Update policy file name] *************************************** 2025-09-27 00:54:30.056688 | orchestrator | Saturday 27 September 2025 00:54:03 +0000 (0:00:00.640) 0:00:03.426 **** 2025-09-27 00:54:30.056699 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:54:30.056710 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:54:30.056720 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:54:30.056731 | orchestrator | 2025-09-27 00:54:30.056742 | orchestrator | TASK [horizon : Check if policies shall be overwritten] ************************ 2025-09-27 00:54:30.056753 | orchestrator | 
Saturday 27 September 2025 00:54:03 +0000 (0:00:00.243) 0:00:03.670 **** 2025-09-27 00:54:30.056763 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:54:30.056774 | orchestrator | 2025-09-27 00:54:30.056790 | orchestrator | TASK [horizon : Update custom policy file name] ******************************** 2025-09-27 00:54:30.056802 | orchestrator | Saturday 27 September 2025 00:54:03 +0000 (0:00:00.103) 0:00:03.773 **** 2025-09-27 00:54:30.056813 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:54:30.056824 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:54:30.056837 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:54:30.056850 | orchestrator | 2025-09-27 00:54:30.056863 | orchestrator | TASK [horizon : Update policy file name] *************************************** 2025-09-27 00:54:30.056881 | orchestrator | Saturday 27 September 2025 00:54:04 +0000 (0:00:00.349) 0:00:04.123 **** 2025-09-27 00:54:30.056894 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:54:30.056907 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:54:30.056918 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:54:30.056929 | orchestrator | 2025-09-27 00:54:30.056940 | orchestrator | TASK [horizon : Check if policies shall be overwritten] ************************ 2025-09-27 00:54:30.056951 | orchestrator | Saturday 27 September 2025 00:54:04 +0000 (0:00:00.283) 0:00:04.406 **** 2025-09-27 00:54:30.056970 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:54:30.056980 | orchestrator | 2025-09-27 00:54:30.056991 | orchestrator | TASK [horizon : Update custom policy file name] ******************************** 2025-09-27 00:54:30.057002 | orchestrator | Saturday 27 September 2025 00:54:04 +0000 (0:00:00.116) 0:00:04.523 **** 2025-09-27 00:54:30.057013 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:54:30.057023 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:54:30.057034 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:54:30.057045 | orchestrator | 2025-09-27 00:54:30.057055 | orchestrator | TASK [horizon : Update policy file name] *************************************** 2025-09-27 00:54:30.057066 | orchestrator | Saturday 27 September 2025 00:54:04 +0000 (0:00:00.264) 0:00:04.787 **** 2025-09-27 00:54:30.057096 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:54:30.057107 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:54:30.057117 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:54:30.057128 | orchestrator | 2025-09-27 00:54:30.057139 | orchestrator | TASK [horizon : Check if policies shall be overwritten] ************************ 2025-09-27 00:54:30.057149 | orchestrator | Saturday 27 September 2025 00:54:04 +0000 (0:00:00.288) 0:00:05.075 **** 2025-09-27 00:54:30.057160 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:54:30.057171 | orchestrator | 2025-09-27 00:54:30.057182 | orchestrator | TASK [horizon : Update custom policy file name] ******************************** 2025-09-27 00:54:30.057192 | orchestrator | Saturday 27 September 2025 00:54:05 +0000 (0:00:00.124) 0:00:05.200 **** 2025-09-27 00:54:30.057203 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:54:30.057214 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:54:30.057225 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:54:30.057235 | orchestrator | 2025-09-27 00:54:30.057246 | orchestrator | TASK [horizon : Update policy file name] *************************************** 2025-09-27 00:54:30.057257 | orchestrator | Saturday 27 September 
2025 00:54:05 +0000 (0:00:00.561) 0:00:05.762 **** 2025-09-27 00:54:30.057268 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:54:30.057279 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:54:30.057289 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:54:30.057300 | orchestrator | 2025-09-27 00:54:30.057311 | orchestrator | TASK [horizon : Check if policies shall be overwritten] ************************ 2025-09-27 00:54:30.057322 | orchestrator | Saturday 27 September 2025 00:54:06 +0000 (0:00:00.341) 0:00:06.104 **** 2025-09-27 00:54:30.057332 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:54:30.057343 | orchestrator | 2025-09-27 00:54:30.057354 | orchestrator | TASK [horizon : Update custom policy file name] ******************************** 2025-09-27 00:54:30.057364 | orchestrator | Saturday 27 September 2025 00:54:06 +0000 (0:00:00.159) 0:00:06.264 **** 2025-09-27 00:54:30.057375 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:54:30.057386 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:54:30.057396 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:54:30.057407 | orchestrator | 2025-09-27 00:54:30.057418 | orchestrator | TASK [horizon : Update policy file name] *************************************** 2025-09-27 00:54:30.057429 | orchestrator | Saturday 27 September 2025 00:54:06 +0000 (0:00:00.293) 0:00:06.557 **** 2025-09-27 00:54:30.057439 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:54:30.057450 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:54:30.057461 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:54:30.057472 | orchestrator | 2025-09-27 00:54:30.057482 | orchestrator | TASK [horizon : Check if policies shall be overwritten] ************************ 2025-09-27 00:54:30.057493 | orchestrator | Saturday 27 September 2025 00:54:06 +0000 (0:00:00.296) 0:00:06.854 **** 2025-09-27 00:54:30.057504 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:54:30.057515 | orchestrator | 2025-09-27 00:54:30.057525 | orchestrator | TASK [horizon : Update custom policy file name] ******************************** 2025-09-27 00:54:30.057536 | orchestrator | Saturday 27 September 2025 00:54:07 +0000 (0:00:00.308) 0:00:07.162 **** 2025-09-27 00:54:30.057547 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:54:30.057557 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:54:30.057574 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:54:30.057585 | orchestrator | 2025-09-27 00:54:30.057596 | orchestrator | TASK [horizon : Update policy file name] *************************************** 2025-09-27 00:54:30.057607 | orchestrator | Saturday 27 September 2025 00:54:07 +0000 (0:00:00.299) 0:00:07.462 **** 2025-09-27 00:54:30.057617 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:54:30.057628 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:54:30.057639 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:54:30.057649 | orchestrator | 2025-09-27 00:54:30.057660 | orchestrator | TASK [horizon : Check if policies shall be overwritten] ************************ 2025-09-27 00:54:30.057671 | orchestrator | Saturday 27 September 2025 00:54:07 +0000 (0:00:00.300) 0:00:07.762 **** 2025-09-27 00:54:30.057682 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:54:30.057692 | orchestrator | 2025-09-27 00:54:30.057703 | orchestrator | TASK [horizon : Update custom policy file name] ******************************** 2025-09-27 00:54:30.057714 | orchestrator | Saturday 27 September 2025 00:54:07 +0000 (0:00:00.132) 
0:00:07.894 **** 2025-09-27 00:54:30.057725 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:54:30.057735 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:54:30.057746 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:54:30.057757 | orchestrator | 2025-09-27 00:54:30.057767 | orchestrator | TASK [horizon : Update policy file name] *************************************** 2025-09-27 00:54:30.057784 | orchestrator | Saturday 27 September 2025 00:54:08 +0000 (0:00:00.277) 0:00:08.172 **** 2025-09-27 00:54:30.057795 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:54:30.057806 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:54:30.057817 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:54:30.057828 | orchestrator | 2025-09-27 00:54:30.057838 | orchestrator | TASK [horizon : Check if policies shall be overwritten] ************************ 2025-09-27 00:54:30.057849 | orchestrator | Saturday 27 September 2025 00:54:08 +0000 (0:00:00.457) 0:00:08.630 **** 2025-09-27 00:54:30.057860 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:54:30.057871 | orchestrator | 2025-09-27 00:54:30.057887 | orchestrator | TASK [horizon : Update custom policy file name] ******************************** 2025-09-27 00:54:30.057898 | orchestrator | Saturday 27 September 2025 00:54:08 +0000 (0:00:00.121) 0:00:08.751 **** 2025-09-27 00:54:30.057909 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:54:30.057920 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:54:30.057930 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:54:30.057941 | orchestrator | 2025-09-27 00:54:30.057952 | orchestrator | TASK [horizon : Update policy file name] *************************************** 2025-09-27 00:54:30.057963 | orchestrator | Saturday 27 September 2025 00:54:09 +0000 (0:00:00.354) 0:00:09.105 **** 2025-09-27 00:54:30.057973 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:54:30.057984 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:54:30.057995 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:54:30.058006 | orchestrator | 2025-09-27 00:54:30.058229 | orchestrator | TASK [horizon : Check if policies shall be overwritten] ************************ 2025-09-27 00:54:30.058253 | orchestrator | Saturday 27 September 2025 00:54:09 +0000 (0:00:00.325) 0:00:09.431 **** 2025-09-27 00:54:30.058264 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:54:30.058275 | orchestrator | 2025-09-27 00:54:30.058286 | orchestrator | TASK [horizon : Update custom policy file name] ******************************** 2025-09-27 00:54:30.058296 | orchestrator | Saturday 27 September 2025 00:54:09 +0000 (0:00:00.118) 0:00:09.550 **** 2025-09-27 00:54:30.058307 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:54:30.058318 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:54:30.058329 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:54:30.058339 | orchestrator | 2025-09-27 00:54:30.058350 | orchestrator | TASK [horizon : Update policy file name] *************************************** 2025-09-27 00:54:30.058361 | orchestrator | Saturday 27 September 2025 00:54:09 +0000 (0:00:00.277) 0:00:09.828 **** 2025-09-27 00:54:30.058372 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:54:30.058383 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:54:30.058394 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:54:30.058414 | orchestrator | 2025-09-27 00:54:30.058425 | orchestrator | TASK [horizon : Check if policies shall be overwritten] ************************ 
2025-09-27 00:54:30.058435 | orchestrator | Saturday 27 September 2025 00:54:10 +0000 (0:00:00.542) 0:00:10.370 **** 2025-09-27 00:54:30.058444 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:54:30.058454 | orchestrator | 2025-09-27 00:54:30.058464 | orchestrator | TASK [horizon : Update custom policy file name] ******************************** 2025-09-27 00:54:30.058473 | orchestrator | Saturday 27 September 2025 00:54:10 +0000 (0:00:00.132) 0:00:10.503 **** 2025-09-27 00:54:30.058483 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:54:30.058493 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:54:30.058502 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:54:30.058512 | orchestrator | 2025-09-27 00:54:30.058521 | orchestrator | TASK [horizon : Update policy file name] *************************************** 2025-09-27 00:54:30.058531 | orchestrator | Saturday 27 September 2025 00:54:10 +0000 (0:00:00.307) 0:00:10.810 **** 2025-09-27 00:54:30.058541 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:54:30.058550 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:54:30.058560 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:54:30.058569 | orchestrator | 2025-09-27 00:54:30.058579 | orchestrator | TASK [horizon : Check if policies shall be overwritten] ************************ 2025-09-27 00:54:30.058589 | orchestrator | Saturday 27 September 2025 00:54:11 +0000 (0:00:00.348) 0:00:11.159 **** 2025-09-27 00:54:30.058598 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:54:30.058608 | orchestrator | 2025-09-27 00:54:30.058617 | orchestrator | TASK [horizon : Update custom policy file name] ******************************** 2025-09-27 00:54:30.058627 | orchestrator | Saturday 27 September 2025 00:54:11 +0000 (0:00:00.126) 0:00:11.285 **** 2025-09-27 00:54:30.058637 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:54:30.058646 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:54:30.058656 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:54:30.058665 | orchestrator | 2025-09-27 00:54:30.058675 | orchestrator | TASK [horizon : Copying over config.json files for services] ******************* 2025-09-27 00:54:30.058685 | orchestrator | Saturday 27 September 2025 00:54:11 +0000 (0:00:00.460) 0:00:11.746 **** 2025-09-27 00:54:30.058694 | orchestrator | changed: [testbed-node-2] 2025-09-27 00:54:30.058704 | orchestrator | changed: [testbed-node-0] 2025-09-27 00:54:30.058713 | orchestrator | changed: [testbed-node-1] 2025-09-27 00:54:30.058723 | orchestrator | 2025-09-27 00:54:30.058732 | orchestrator | TASK [horizon : Copying over horizon.conf] ************************************* 2025-09-27 00:54:30.058742 | orchestrator | Saturday 27 September 2025 00:54:13 +0000 (0:00:01.679) 0:00:13.425 **** 2025-09-27 00:54:30.058752 | orchestrator | changed: [testbed-node-0] => (item=/ansible/roles/horizon/templates/horizon.conf.j2) 2025-09-27 00:54:30.058761 | orchestrator | changed: [testbed-node-1] => (item=/ansible/roles/horizon/templates/horizon.conf.j2) 2025-09-27 00:54:30.058771 | orchestrator | changed: [testbed-node-2] => (item=/ansible/roles/horizon/templates/horizon.conf.j2) 2025-09-27 00:54:30.058781 | orchestrator | 2025-09-27 00:54:30.058790 | orchestrator | TASK [horizon : Copying over kolla-settings.py] ******************************** 2025-09-27 00:54:30.058800 | orchestrator | Saturday 27 September 2025 00:54:15 +0000 (0:00:02.302) 0:00:15.727 **** 2025-09-27 00:54:30.058809 | orchestrator | changed: [testbed-node-0] => 
(item=/ansible/roles/horizon/templates/_9998-kolla-settings.py.j2) 2025-09-27 00:54:30.058819 | orchestrator | changed: [testbed-node-1] => (item=/ansible/roles/horizon/templates/_9998-kolla-settings.py.j2) 2025-09-27 00:54:30.058829 | orchestrator | changed: [testbed-node-2] => (item=/ansible/roles/horizon/templates/_9998-kolla-settings.py.j2) 2025-09-27 00:54:30.058838 | orchestrator | 2025-09-27 00:54:30.058848 | orchestrator | TASK [horizon : Copying over custom-settings.py] ******************************* 2025-09-27 00:54:30.058865 | orchestrator | Saturday 27 September 2025 00:54:18 +0000 (0:00:02.522) 0:00:18.250 **** 2025-09-27 00:54:30.058877 | orchestrator | changed: [testbed-node-0] => (item=/ansible/roles/horizon/templates/_9999-custom-settings.py.j2) 2025-09-27 00:54:30.058895 | orchestrator | changed: [testbed-node-1] => (item=/ansible/roles/horizon/templates/_9999-custom-settings.py.j2) 2025-09-27 00:54:30.058919 | orchestrator | changed: [testbed-node-2] => (item=/ansible/roles/horizon/templates/_9999-custom-settings.py.j2) 2025-09-27 00:54:30.058931 | orchestrator | 2025-09-27 00:54:30.058942 | orchestrator | TASK [horizon : Copying over existing policy file] ***************************** 2025-09-27 00:54:30.058953 | orchestrator | Saturday 27 September 2025 00:54:20 +0000 (0:00:02.210) 0:00:20.461 **** 2025-09-27 00:54:30.058964 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:54:30.058975 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:54:30.058986 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:54:30.058997 | orchestrator | 2025-09-27 00:54:30.059008 | orchestrator | TASK [horizon : Copying over custom themes] ************************************ 2025-09-27 00:54:30.059019 | orchestrator | Saturday 27 September 2025 00:54:20 +0000 (0:00:00.317) 0:00:20.778 **** 2025-09-27 00:54:30.059031 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:54:30.059041 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:54:30.059053 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:54:30.059064 | orchestrator | 2025-09-27 00:54:30.059091 | orchestrator | TASK [horizon : include_tasks] ************************************************* 2025-09-27 00:54:30.059102 | orchestrator | Saturday 27 September 2025 00:54:21 +0000 (0:00:00.317) 0:00:21.096 **** 2025-09-27 00:54:30.059114 | orchestrator | included: /ansible/roles/horizon/tasks/copy-certs.yml for testbed-node-0, testbed-node-1, testbed-node-2 2025-09-27 00:54:30.059125 | orchestrator | 2025-09-27 00:54:30.059136 | orchestrator | TASK [service-cert-copy : horizon | Copying over extra CA certificates] ******** 2025-09-27 00:54:30.059147 | orchestrator | Saturday 27 September 2025 00:54:21 +0000 (0:00:00.727) 0:00:21.823 **** 2025-09-27 00:54:30.059161 | orchestrator | changed: [testbed-node-0] => (item={'key': 'horizon', 'value': {'container_name': 'horizon', 'group': 'horizon', 'enabled': True, 'image': 'registry.osism.tech/kolla/horizon:2024.2', 'environment': {'ENABLE_BLAZAR': 'no', 'ENABLE_CLOUDKITTY': 'no', 'ENABLE_DESIGNATE': 'yes', 'ENABLE_FWAAS': 'no', 'ENABLE_HEAT': 'no', 'ENABLE_IRONIC': 'no', 'ENABLE_MAGNUM': 'yes', 'ENABLE_MANILA': 'yes', 'ENABLE_MASAKARI': 'no', 'ENABLE_MISTRAL': 'no', 'ENABLE_NEUTRON_VPNAAS': 'no', 'ENABLE_OCTAVIA': 'yes', 'ENABLE_TACKER': 'no', 'ENABLE_TROVE': 'no', 'ENABLE_WATCHER': 'no', 'ENABLE_ZUN': 'no', 'FORCE_GENERATE': 'no'}, 'volumes': ['/etc/kolla/horizon/:/var/lib/kolla/config_files/:ro', '', '/etc/localtime:/etc/localtime:ro', 
'/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:80'], 'timeout': '30'}, 'haproxy': {'horizon': {'enabled': True, 'mode': 'http', 'external': False, 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin'], 'tls_backend': 'no'}, 'horizon_redirect': {'enabled': True, 'mode': 'redirect', 'external': False, 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'horizon_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin'], 'tls_backend': 'no'}, 'horizon_external_redirect': {'enabled': True, 'mode': 'redirect', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'acme_client': {'enabled': True, 'with_frontend': False, 'custom_member_list': []}}}}) 2025-09-27 00:54:30.059190 | orchestrator | changed: [testbed-node-1] => (item={'key': 'horizon', 'value': {'container_name': 'horizon', 'group': 'horizon', 'enabled': True, 'image': 'registry.osism.tech/kolla/horizon:2024.2', 'environment': {'ENABLE_BLAZAR': 'no', 'ENABLE_CLOUDKITTY': 'no', 'ENABLE_DESIGNATE': 'yes', 'ENABLE_FWAAS': 'no', 'ENABLE_HEAT': 'no', 'ENABLE_IRONIC': 'no', 'ENABLE_MAGNUM': 'yes', 'ENABLE_MANILA': 'yes', 'ENABLE_MASAKARI': 'no', 'ENABLE_MISTRAL': 'no', 'ENABLE_NEUTRON_VPNAAS': 'no', 'ENABLE_OCTAVIA': 'yes', 'ENABLE_TACKER': 'no', 'ENABLE_TROVE': 'no', 'ENABLE_WATCHER': 'no', 'ENABLE_ZUN': 'no', 'FORCE_GENERATE': 'no'}, 'volumes': ['/etc/kolla/horizon/:/var/lib/kolla/config_files/:ro', '', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:80'], 'timeout': '30'}, 'haproxy': {'horizon': {'enabled': True, 'mode': 'http', 'external': False, 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin'], 'tls_backend': 'no'}, 'horizon_redirect': {'enabled': True, 'mode': 'redirect', 'external': False, 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'horizon_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin'], 'tls_backend': 'no'}, 'horizon_external_redirect': {'enabled': True, 'mode': 'redirect', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'acme_client': 
{'enabled': True, 'with_frontend': False, 'custom_member_list': []}}}}) 2025-09-27 00:54:30.059215 | orchestrator | changed: [testbed-node-2] => (item={'key': 'horizon', 'value': {'container_name': 'horizon', 'group': 'horizon', 'enabled': True, 'image': 'registry.osism.tech/kolla/horizon:2024.2', 'environment': {'ENABLE_BLAZAR': 'no', 'ENABLE_CLOUDKITTY': 'no', 'ENABLE_DESIGNATE': 'yes', 'ENABLE_FWAAS': 'no', 'ENABLE_HEAT': 'no', 'ENABLE_IRONIC': 'no', 'ENABLE_MAGNUM': 'yes', 'ENABLE_MANILA': 'yes', 'ENABLE_MASAKARI': 'no', 'ENABLE_MISTRAL': 'no', 'ENABLE_NEUTRON_VPNAAS': 'no', 'ENABLE_OCTAVIA': 'yes', 'ENABLE_TACKER': 'no', 'ENABLE_TROVE': 'no', 'ENABLE_WATCHER': 'no', 'ENABLE_ZUN': 'no', 'FORCE_GENERATE': 'no'}, 'volumes': ['/etc/kolla/horizon/:/var/lib/kolla/config_files/:ro', '', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:80'], 'timeout': '30'}, 'haproxy': {'horizon': {'enabled': True, 'mode': 'http', 'external': False, 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin'], 'tls_backend': 'no'}, 'horizon_redirect': {'enabled': True, 'mode': 'redirect', 'external': False, 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'horizon_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin'], 'tls_backend': 'no'}, 'horizon_external_redirect': {'enabled': True, 'mode': 'redirect', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'acme_client': {'enabled': True, 'with_frontend': False, 'custom_member_list': []}}}}) 2025-09-27 00:54:30.059233 | orchestrator | 2025-09-27 00:54:30.059243 | orchestrator | TASK [service-cert-copy : horizon | Copying over backend internal TLS certificate] *** 2025-09-27 00:54:30.059253 | orchestrator | Saturday 27 September 2025 00:54:23 +0000 (0:00:01.918) 0:00:23.741 **** 2025-09-27 00:54:30.059277 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'horizon', 'value': {'container_name': 'horizon', 'group': 'horizon', 'enabled': True, 'image': 'registry.osism.tech/kolla/horizon:2024.2', 'environment': {'ENABLE_BLAZAR': 'no', 'ENABLE_CLOUDKITTY': 'no', 'ENABLE_DESIGNATE': 'yes', 'ENABLE_FWAAS': 'no', 'ENABLE_HEAT': 'no', 'ENABLE_IRONIC': 'no', 'ENABLE_MAGNUM': 'yes', 'ENABLE_MANILA': 'yes', 'ENABLE_MASAKARI': 'no', 'ENABLE_MISTRAL': 'no', 'ENABLE_NEUTRON_VPNAAS': 'no', 'ENABLE_OCTAVIA': 'yes', 'ENABLE_TACKER': 'no', 'ENABLE_TROVE': 'no', 'ENABLE_WATCHER': 'no', 'ENABLE_ZUN': 'no', 'FORCE_GENERATE': 'no'}, 'volumes': ['/etc/kolla/horizon/:/var/lib/kolla/config_files/:ro', '', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:80'], 'timeout': '30'}, 
'haproxy': {'horizon': {'enabled': True, 'mode': 'http', 'external': False, 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin'], 'tls_backend': 'no'}, 'horizon_redirect': {'enabled': True, 'mode': 'redirect', 'external': False, 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'horizon_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin'], 'tls_backend': 'no'}, 'horizon_external_redirect': {'enabled': True, 'mode': 'redirect', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'acme_client': {'enabled': True, 'with_frontend': False, 'custom_member_list': []}}}})  2025-09-27 00:54:30.059289 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:54:30.059305 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'horizon', 'value': {'container_name': 'horizon', 'group': 'horizon', 'enabled': True, 'image': 'registry.osism.tech/kolla/horizon:2024.2', 'environment': {'ENABLE_BLAZAR': 'no', 'ENABLE_CLOUDKITTY': 'no', 'ENABLE_DESIGNATE': 'yes', 'ENABLE_FWAAS': 'no', 'ENABLE_HEAT': 'no', 'ENABLE_IRONIC': 'no', 'ENABLE_MAGNUM': 'yes', 'ENABLE_MANILA': 'yes', 'ENABLE_MASAKARI': 'no', 'ENABLE_MISTRAL': 'no', 'ENABLE_NEUTRON_VPNAAS': 'no', 'ENABLE_OCTAVIA': 'yes', 'ENABLE_TACKER': 'no', 'ENABLE_TROVE': 'no', 'ENABLE_WATCHER': 'no', 'ENABLE_ZUN': 'no', 'FORCE_GENERATE': 'no'}, 'volumes': ['/etc/kolla/horizon/:/var/lib/kolla/config_files/:ro', '', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:80'], 'timeout': '30'}, 'haproxy': {'horizon': {'enabled': True, 'mode': 'http', 'external': False, 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin'], 'tls_backend': 'no'}, 'horizon_redirect': {'enabled': True, 'mode': 'redirect', 'external': False, 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'horizon_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin'], 'tls_backend': 'no'}, 'horizon_external_redirect': {'enabled': True, 'mode': 'redirect', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'acme_client': {'enabled': True, 'with_frontend': False, 'custom_member_list': []}}}})  2025-09-27 00:54:30.059329 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'horizon', 
'value': {'container_name': 'horizon', 'group': 'horizon', 'enabled': True, 'image': 'registry.osism.tech/kolla/horizon:2024.2', 'environment': {'ENABLE_BLAZAR': 'no', 'ENABLE_CLOUDKITTY': 'no', 'ENABLE_DESIGNATE': 'yes', 'ENABLE_FWAAS': 'no', 'ENABLE_HEAT': 'no', 'ENABLE_IRONIC': 'no', 'ENABLE_MAGNUM': 'yes', 'ENABLE_MANILA': 'yes', 'ENABLE_MASAKARI': 'no', 'ENABLE_MISTRAL': 'no', 'ENABLE_NEUTRON_VPNAAS': 'no', 'ENABLE_OCTAVIA': 'yes', 'ENABLE_TACKER': 'no', 'ENABLE_TROVE': 'no', 'ENABLE_WATCHER': 'no', 'ENABLE_ZUN': 'no', 'FORCE_GENERATE': 'no'}, 'volumes': ['/etc/kolla/horizon/:/var/lib/kolla/config_files/:ro', '', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:80'], 'timeout': '30'}, 'haproxy': {'horizon': {'enabled': True, 'mode': 'http', 'external': False, 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin'], 'tls_backend': 'no'}, 'horizon_redirect': {'enabled': True, 'mode': 'redirect', 'external': False, 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'horizon_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin'], 'tls_backend': 'no'}, 'horizon_external_redirect': {'enabled': True, 'mode': 'redirect', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'acme_client': {'enabled': True, 'with_frontend': False, 'custom_member_list': []}}}})  2025-09-27 00:54:30.059340 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:54:30.059350 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:54:30.059359 | orchestrator | 2025-09-27 00:54:30.059369 | orchestrator | TASK [service-cert-copy : horizon | Copying over backend internal TLS key] ***** 2025-09-27 00:54:30.059378 | orchestrator | Saturday 27 September 2025 00:54:24 +0000 (0:00:00.806) 0:00:24.548 **** 2025-09-27 00:54:30.059401 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'horizon', 'value': {'container_name': 'horizon', 'group': 'horizon', 'enabled': True, 'image': 'registry.osism.tech/kolla/horizon:2024.2', 'environment': {'ENABLE_BLAZAR': 'no', 'ENABLE_CLOUDKITTY': 'no', 'ENABLE_DESIGNATE': 'yes', 'ENABLE_FWAAS': 'no', 'ENABLE_HEAT': 'no', 'ENABLE_IRONIC': 'no', 'ENABLE_MAGNUM': 'yes', 'ENABLE_MANILA': 'yes', 'ENABLE_MASAKARI': 'no', 'ENABLE_MISTRAL': 'no', 'ENABLE_NEUTRON_VPNAAS': 'no', 'ENABLE_OCTAVIA': 'yes', 'ENABLE_TACKER': 'no', 'ENABLE_TROVE': 'no', 'ENABLE_WATCHER': 'no', 'ENABLE_ZUN': 'no', 'FORCE_GENERATE': 'no'}, 'volumes': ['/etc/kolla/horizon/:/var/lib/kolla/config_files/:ro', '', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:80'], 'timeout': '30'}, 'haproxy': {'horizon': {'enabled': 
True, 'mode': 'http', 'external': False, 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin'], 'tls_backend': 'no'}, 'horizon_redirect': {'enabled': True, 'mode': 'redirect', 'external': False, 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'horizon_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin'], 'tls_backend': 'no'}, 'horizon_external_redirect': {'enabled': True, 'mode': 'redirect', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'acme_client': {'enabled': True, 'with_frontend': False, 'custom_member_list': []}}}})  2025-09-27 00:54:30.059419 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:54:30.059429 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'horizon', 'value': {'container_name': 'horizon', 'group': 'horizon', 'enabled': True, 'image': 'registry.osism.tech/kolla/horizon:2024.2', 'environment': {'ENABLE_BLAZAR': 'no', 'ENABLE_CLOUDKITTY': 'no', 'ENABLE_DESIGNATE': 'yes', 'ENABLE_FWAAS': 'no', 'ENABLE_HEAT': 'no', 'ENABLE_IRONIC': 'no', 'ENABLE_MAGNUM': 'yes', 'ENABLE_MANILA': 'yes', 'ENABLE_MASAKARI': 'no', 'ENABLE_MISTRAL': 'no', 'ENABLE_NEUTRON_VPNAAS': 'no', 'ENABLE_OCTAVIA': 'yes', 'ENABLE_TACKER': 'no', 'ENABLE_TROVE': 'no', 'ENABLE_WATCHER': 'no', 'ENABLE_ZUN': 'no', 'FORCE_GENERATE': 'no'}, 'volumes': ['/etc/kolla/horizon/:/var/lib/kolla/config_files/:ro', '', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:80'], 'timeout': '30'}, 'haproxy': {'horizon': {'enabled': True, 'mode': 'http', 'external': False, 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin'], 'tls_backend': 'no'}, 'horizon_redirect': {'enabled': True, 'mode': 'redirect', 'external': False, 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'horizon_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin'], 'tls_backend': 'no'}, 'horizon_external_redirect': {'enabled': True, 'mode': 'redirect', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'acme_client': {'enabled': True, 'with_frontend': False, 'custom_member_list': []}}}})  2025-09-27 00:54:30.059440 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:54:30.059462 | orchestrator | skipping: 
[testbed-node-2] => (item={'key': 'horizon', 'value': {'container_name': 'horizon', 'group': 'horizon', 'enabled': True, 'image': 'registry.osism.tech/kolla/horizon:2024.2', 'environment': {'ENABLE_BLAZAR': 'no', 'ENABLE_CLOUDKITTY': 'no', 'ENABLE_DESIGNATE': 'yes', 'ENABLE_FWAAS': 'no', 'ENABLE_HEAT': 'no', 'ENABLE_IRONIC': 'no', 'ENABLE_MAGNUM': 'yes', 'ENABLE_MANILA': 'yes', 'ENABLE_MASAKARI': 'no', 'ENABLE_MISTRAL': 'no', 'ENABLE_NEUTRON_VPNAAS': 'no', 'ENABLE_OCTAVIA': 'yes', 'ENABLE_TACKER': 'no', 'ENABLE_TROVE': 'no', 'ENABLE_WATCHER': 'no', 'ENABLE_ZUN': 'no', 'FORCE_GENERATE': 'no'}, 'volumes': ['/etc/kolla/horizon/:/var/lib/kolla/config_files/:ro', '', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:80'], 'timeout': '30'}, 'haproxy': {'horizon': {'enabled': True, 'mode': 'http', 'external': False, 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin'], 'tls_backend': 'no'}, 'horizon_redirect': {'enabled': True, 'mode': 'redirect', 'external': False, 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'horizon_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin'], 'tls_backend': 'no'}, 'horizon_external_redirect': {'enabled': True, 'mode': 'redirect', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'acme_client': {'enabled': True, 'with_frontend': False, 'custom_member_list': []}}}})  2025-09-27 00:54:30.059480 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:54:30.059490 | orchestrator | 2025-09-27 00:54:30.059499 | orchestrator | TASK [horizon : Deploy horizon container] ************************************** 2025-09-27 00:54:30.059509 | orchestrator | Saturday 27 September 2025 00:54:25 +0000 (0:00:00.994) 0:00:25.542 **** 2025-09-27 00:54:30.059519 | orchestrator | changed: [testbed-node-1] => (item={'key': 'horizon', 'value': {'container_name': 'horizon', 'group': 'horizon', 'enabled': True, 'image': 'registry.osism.tech/kolla/horizon:2024.2', 'environment': {'ENABLE_BLAZAR': 'no', 'ENABLE_CLOUDKITTY': 'no', 'ENABLE_DESIGNATE': 'yes', 'ENABLE_FWAAS': 'no', 'ENABLE_HEAT': 'no', 'ENABLE_IRONIC': 'no', 'ENABLE_MAGNUM': 'yes', 'ENABLE_MANILA': 'yes', 'ENABLE_MASAKARI': 'no', 'ENABLE_MISTRAL': 'no', 'ENABLE_NEUTRON_VPNAAS': 'no', 'ENABLE_OCTAVIA': 'yes', 'ENABLE_TACKER': 'no', 'ENABLE_TROVE': 'no', 'ENABLE_WATCHER': 'no', 'ENABLE_ZUN': 'no', 'FORCE_GENERATE': 'no'}, 'volumes': ['/etc/kolla/horizon/:/var/lib/kolla/config_files/:ro', '', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:80'], 'timeout': '30'}, 'haproxy': {'horizon': {'enabled': True, 'mode': 'http', 
'external': False, 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin'], 'tls_backend': 'no'}, 'horizon_redirect': {'enabled': True, 'mode': 'redirect', 'external': False, 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'horizon_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin'], 'tls_backend': 'no'}, 'horizon_external_redirect': {'enabled': True, 'mode': 'redirect', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'acme_client': {'enabled': True, 'with_frontend': False, 'custom_member_list': []}}}}) 2025-09-27 00:54:30.059548 | orchestrator | changed: [testbed-node-0] => (item={'key': 'horizon', 'value': {'container_name': 'horizon', 'group': 'horizon', 'enabled': True, 'image': 'registry.osism.tech/kolla/horizon:2024.2', 'environment': {'ENABLE_BLAZAR': 'no', 'ENABLE_CLOUDKITTY': 'no', 'ENABLE_DESIGNATE': 'yes', 'ENABLE_FWAAS': 'no', 'ENABLE_HEAT': 'no', 'ENABLE_IRONIC': 'no', 'ENABLE_MAGNUM': 'yes', 'ENABLE_MANILA': 'yes', 'ENABLE_MASAKARI': 'no', 'ENABLE_MISTRAL': 'no', 'ENABLE_NEUTRON_VPNAAS': 'no', 'ENABLE_OCTAVIA': 'yes', 'ENABLE_TACKER': 'no', 'ENABLE_TROVE': 'no', 'ENABLE_WATCHER': 'no', 'ENABLE_ZUN': 'no', 'FORCE_GENERATE': 'no'}, 'volumes': ['/etc/kolla/horizon/:/var/lib/kolla/config_files/:ro', '', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:80'], 'timeout': '30'}, 'haproxy': {'horizon': {'enabled': True, 'mode': 'http', 'external': False, 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin'], 'tls_backend': 'no'}, 'horizon_redirect': {'enabled': True, 'mode': 'redirect', 'external': False, 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'horizon_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin'], 'tls_backend': 'no'}, 'horizon_external_redirect': {'enabled': True, 'mode': 'redirect', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'acme_client': {'enabled': True, 'with_frontend': False, 'custom_member_list': []}}}}) 2025-09-27 00:54:30.059560 | orchestrator | changed: [testbed-node-2] => (item={'key': 'horizon', 'value': {'container_name': 'horizon', 'group': 'horizon', 'enabled': True, 'image': 'registry.osism.tech/kolla/horizon:2024.2', 
'environment': {'ENABLE_BLAZAR': 'no', 'ENABLE_CLOUDKITTY': 'no', 'ENABLE_DESIGNATE': 'yes', 'ENABLE_FWAAS': 'no', 'ENABLE_HEAT': 'no', 'ENABLE_IRONIC': 'no', 'ENABLE_MAGNUM': 'yes', 'ENABLE_MANILA': 'yes', 'ENABLE_MASAKARI': 'no', 'ENABLE_MISTRAL': 'no', 'ENABLE_NEUTRON_VPNAAS': 'no', 'ENABLE_OCTAVIA': 'yes', 'ENABLE_TACKER': 'no', 'ENABLE_TROVE': 'no', 'ENABLE_WATCHER': 'no', 'ENABLE_ZUN': 'no', 'FORCE_GENERATE': 'no'}, 'volumes': ['/etc/kolla/horizon/:/var/lib/kolla/config_files/:ro', '', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:80'], 'timeout': '30'}, 'haproxy': {'horizon': {'enabled': True, 'mode': 'http', 'external': False, 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin'], 'tls_backend': 'no'}, 'horizon_redirect': {'enabled': True, 'mode': 'redirect', 'external': False, 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'horizon_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '443', 'listen_port': '80', 'frontend_http_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }'], 'backend_http_extra': ['balance roundrobin'], 'tls_backend': 'no'}, 'horizon_external_redirect': {'enabled': True, 'mode': 'redirect', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '80', 'listen_port': '80', 'frontend_redirect_extra': ['use_backend acme_client_back if { path_reg ^/.well-known/acme-challenge/.+ }']}, 'acme_client': {'enabled': True, 'with_frontend': False, 'custom_member_list': []}}}}) 2025-09-27 00:54:30.059576 | orchestrator | 2025-09-27 00:54:30.059586 | orchestrator | TASK [horizon : include_tasks] ************************************************* 2025-09-27 00:54:30.059596 | orchestrator | Saturday 27 September 2025 00:54:27 +0000 (0:00:02.137) 0:00:27.679 **** 2025-09-27 00:54:30.059605 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:54:30.059615 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:54:30.059624 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:54:30.059633 | orchestrator | 2025-09-27 00:54:30.059643 | orchestrator | TASK [horizon : include_tasks] ************************************************* 2025-09-27 00:54:30.059652 | orchestrator | Saturday 27 September 2025 00:54:27 +0000 (0:00:00.319) 0:00:27.999 **** 2025-09-27 00:54:30.059662 | orchestrator | included: /ansible/roles/horizon/tasks/bootstrap.yml for testbed-node-0, testbed-node-1, testbed-node-2 2025-09-27 00:54:30.059672 | orchestrator | 2025-09-27 00:54:30.059681 | orchestrator | TASK [horizon : Creating Horizon database] ************************************* 2025-09-27 00:54:30.059695 | orchestrator | Saturday 27 September 2025 00:54:28 +0000 (0:00:00.500) 0:00:28.499 **** 2025-09-27 00:54:30.059705 | orchestrator | fatal: [testbed-node-0]: FAILED! 
=> {"changed": false, "msg": "kolla_toolbox container is not running."} 2025-09-27 00:54:30.059715 | orchestrator | 2025-09-27 00:54:30.059724 | orchestrator | PLAY RECAP ********************************************************************* 2025-09-27 00:54:30.059738 | orchestrator | testbed-node-0 : ok=33  changed=7  unreachable=0 failed=1  skipped=25  rescued=0 ignored=0 2025-09-27 00:54:30.059748 | orchestrator | testbed-node-1 : ok=33  changed=7  unreachable=0 failed=0 skipped=15  rescued=0 ignored=0 2025-09-27 00:54:30.059758 | orchestrator | testbed-node-2 : ok=33  changed=7  unreachable=0 failed=0 skipped=15  rescued=0 ignored=0 2025-09-27 00:54:30.059767 | orchestrator | 2025-09-27 00:54:30.059777 | orchestrator | 2025-09-27 00:54:30.059786 | orchestrator | TASKS RECAP ******************************************************************** 2025-09-27 00:54:30.059796 | orchestrator | Saturday 27 September 2025 00:54:29 +0000 (0:00:00.811) 0:00:29.311 **** 2025-09-27 00:54:30.059805 | orchestrator | =============================================================================== 2025-09-27 00:54:30.059815 | orchestrator | horizon : Copying over kolla-settings.py -------------------------------- 2.52s 2025-09-27 00:54:30.059824 | orchestrator | horizon : Copying over horizon.conf ------------------------------------- 2.30s 2025-09-27 00:54:30.059833 | orchestrator | horizon : Copying over custom-settings.py ------------------------------- 2.21s 2025-09-27 00:54:30.059843 | orchestrator | horizon : Deploy horizon container -------------------------------------- 2.14s 2025-09-27 00:54:30.059852 | orchestrator | service-cert-copy : horizon | Copying over extra CA certificates -------- 1.92s 2025-09-27 00:54:30.059862 | orchestrator | horizon : Copying over config.json files for services ------------------- 1.68s 2025-09-27 00:54:30.059871 | orchestrator | horizon : Ensuring config directories exist ----------------------------- 1.16s 2025-09-27 00:54:30.059880 | orchestrator | service-cert-copy : horizon | Copying over backend internal TLS key ----- 0.99s 2025-09-27 00:54:30.059890 | orchestrator | horizon : Creating Horizon database ------------------------------------- 0.81s 2025-09-27 00:54:30.059899 | orchestrator | service-cert-copy : horizon | Copying over backend internal TLS certificate --- 0.81s 2025-09-27 00:54:30.059909 | orchestrator | horizon : include_tasks ------------------------------------------------- 0.73s 2025-09-27 00:54:30.059927 | orchestrator | horizon : include_tasks ------------------------------------------------- 0.64s 2025-09-27 00:54:30.059937 | orchestrator | horizon : Update custom policy file name -------------------------------- 0.56s 2025-09-27 00:54:30.059946 | orchestrator | horizon : Update policy file name --------------------------------------- 0.54s 2025-09-27 00:54:30.059956 | orchestrator | horizon : include_tasks ------------------------------------------------- 0.50s 2025-09-27 00:54:30.059965 | orchestrator | horizon : Update custom policy file name -------------------------------- 0.46s 2025-09-27 00:54:30.059975 | orchestrator | horizon : Update policy file name --------------------------------------- 0.46s 2025-09-27 00:54:30.059984 | orchestrator | horizon : include_tasks ------------------------------------------------- 0.42s 2025-09-27 00:54:30.059994 | orchestrator | Group hosts based on enabled services ----------------------------------- 0.36s 2025-09-27 00:54:30.060003 | orchestrator | horizon : Update custom policy file name 
2025-09-27 00:54:30.060013 | orchestrator | 2025-09-27 00:54:30 | INFO  | Task cd555ed5-77dc-4122-95c3-86725144bacd is in state STARTED
2025-09-27 00:54:30.060022 | orchestrator | 2025-09-27 00:54:30 | INFO  | Task 31d12d1f-fed8-4943-b1b7-173bc3b5ddd4 is in state STARTED
2025-09-27 00:54:30.060032 | orchestrator | 2025-09-27 00:54:30 | INFO  | Wait 1 second(s) until the next check
2025-09-27 00:54:33.093142 | orchestrator | 2025-09-27 00:54:33 | INFO  | Task cd555ed5-77dc-4122-95c3-86725144bacd is in state STARTED
2025-09-27 00:54:33.095441 | orchestrator | 2025-09-27 00:54:33 | INFO  | Task 31d12d1f-fed8-4943-b1b7-173bc3b5ddd4 is in state STARTED
2025-09-27 00:54:33.095473 | orchestrator | 2025-09-27 00:54:33 | INFO  | Wait 1 second(s) until the next check
2025-09-27 00:54:36.160669 | orchestrator | 2025-09-27 00:54:36 | INFO  | Task cd555ed5-77dc-4122-95c3-86725144bacd is in state STARTED
2025-09-27 00:54:36.161868 | orchestrator | 2025-09-27 00:54:36 | INFO  | Task 31d12d1f-fed8-4943-b1b7-173bc3b5ddd4 is in state STARTED
2025-09-27 00:54:36.162249 | orchestrator | 2025-09-27 00:54:36 | INFO  | Wait 1 second(s) until the next check
2025-09-27 00:54:39.205469 | orchestrator | 2025-09-27 00:54:39 | INFO  | Task cd555ed5-77dc-4122-95c3-86725144bacd is in state STARTED
2025-09-27 00:54:39.206521 | orchestrator | 2025-09-27 00:54:39 | INFO  | Task 31d12d1f-fed8-4943-b1b7-173bc3b5ddd4 is in state STARTED
2025-09-27 00:54:39.206747 | orchestrator | 2025-09-27 00:54:39 | INFO  | Wait 1 second(s) until the next check
2025-09-27 00:54:42.262448 | orchestrator | 2025-09-27 00:54:42 | INFO  | Task cd555ed5-77dc-4122-95c3-86725144bacd is in state STARTED
2025-09-27 00:54:42.264545 | orchestrator | 2025-09-27 00:54:42 | INFO  | Task 31d12d1f-fed8-4943-b1b7-173bc3b5ddd4 is in state STARTED
2025-09-27 00:54:42.264756 | orchestrator | 2025-09-27 00:54:42 | INFO  | Wait 1 second(s) until the next check
2025-09-27 00:54:45.308697 | orchestrator | 2025-09-27 00:54:45 | INFO  | Task cd555ed5-77dc-4122-95c3-86725144bacd is in state STARTED
2025-09-27 00:54:45.308816 | orchestrator | 2025-09-27 00:54:45 | INFO  | Task 31d12d1f-fed8-4943-b1b7-173bc3b5ddd4 is in state STARTED
2025-09-27 00:54:45.308832 | orchestrator | 2025-09-27 00:54:45 | INFO  | Wait 1 second(s) until the next check
2025-09-27 00:54:48.343324 | orchestrator | 2025-09-27 00:54:48 | INFO  | Task f30d0322-1fdf-44d4-9ab0-6a026f50db97 is in state STARTED
2025-09-27 00:54:48.345263 | orchestrator | 2025-09-27 00:54:48 | INFO  | Task e53b93e4-c64c-4396-8a50-60d422b94ec3 is in state STARTED
2025-09-27 00:54:48.345681 | orchestrator | 2025-09-27 00:54:48 | INFO  | Task d1ceef8b-3048-43ad-8f4d-705b5e676fb0 is in state STARTED
2025-09-27 00:54:48.346598 | orchestrator | 2025-09-27 00:54:48 | INFO  | Task cd555ed5-77dc-4122-95c3-86725144bacd is in state STARTED
2025-09-27 00:54:48.347318 | orchestrator | 2025-09-27 00:54:48 | INFO  | Task c9184fac-fcbd-41ba-a9c1-0709e4e59102 is in state STARTED
2025-09-27 00:54:48.349751 | orchestrator | 2025-09-27 00:54:48 | INFO  | Task 31d12d1f-fed8-4943-b1b7-173bc3b5ddd4 is in state SUCCESS
2025-09-27 00:54:48.351407 | orchestrator |
2025-09-27 00:54:48.351443 | orchestrator |
2025-09-27 00:54:48.351452 | orchestrator | PLAY [Group hosts based on configuration] **************************************
2025-09-27 00:54:48.351462 | orchestrator |
2025-09-27 00:54:48.351471 | orchestrator | TASK [Group hosts based on Kolla action] *************************************** 2025-09-27
00:54:48.351480 | orchestrator | Saturday 27 September 2025 00:54:00 +0000 (0:00:00.240) 0:00:00.240 **** 2025-09-27 00:54:48.351489 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:54:48.351499 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:54:48.351511 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:54:48.351519 | orchestrator | 2025-09-27 00:54:48.351528 | orchestrator | TASK [Group hosts based on enabled services] *********************************** 2025-09-27 00:54:48.351537 | orchestrator | Saturday 27 September 2025 00:54:00 +0000 (0:00:00.291) 0:00:00.532 **** 2025-09-27 00:54:48.351545 | orchestrator | ok: [testbed-node-0] => (item=enable_keystone_True) 2025-09-27 00:54:48.351555 | orchestrator | ok: [testbed-node-1] => (item=enable_keystone_True) 2025-09-27 00:54:48.351670 | orchestrator | ok: [testbed-node-2] => (item=enable_keystone_True) 2025-09-27 00:54:48.351680 | orchestrator | 2025-09-27 00:54:48.351688 | orchestrator | PLAY [Apply role keystone] ***************************************************** 2025-09-27 00:54:48.351697 | orchestrator | 2025-09-27 00:54:48.352405 | orchestrator | TASK [keystone : include_tasks] ************************************************ 2025-09-27 00:54:48.352419 | orchestrator | Saturday 27 September 2025 00:54:00 +0000 (0:00:00.353) 0:00:00.885 **** 2025-09-27 00:54:48.352429 | orchestrator | included: /ansible/roles/keystone/tasks/deploy.yml for testbed-node-0, testbed-node-1, testbed-node-2 2025-09-27 00:54:48.352438 | orchestrator | 2025-09-27 00:54:48.352447 | orchestrator | TASK [keystone : Ensuring config directories exist] **************************** 2025-09-27 00:54:48.352456 | orchestrator | Saturday 27 September 2025 00:54:01 +0000 (0:00:00.496) 0:00:01.382 **** 2025-09-27 00:54:48.352469 | orchestrator | changed: [testbed-node-0] => (item={'key': 'keystone', 'value': {'container_name': 'keystone', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone:2024.2', 'volumes': ['/etc/kolla/keystone/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/', '', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:5000'], 'timeout': '30'}, 'haproxy': {'keystone_internal': {'enabled': True, 'mode': 'http', 'external': False, 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin']}, 'keystone_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin']}}}}) 2025-09-27 00:54:48.352483 | orchestrator | changed: [testbed-node-1] => (item={'key': 'keystone', 'value': {'container_name': 'keystone', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone:2024.2', 'volumes': ['/etc/kolla/keystone/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/', '', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:5000'], 'timeout': '30'}, 'haproxy': {'keystone_internal': {'enabled': True, 'mode': 'http', 'external': False, 'tls_backend': 
'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin']}, 'keystone_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin']}}}}) 2025-09-27 00:54:48.352524 | orchestrator | changed: [testbed-node-2] => (item={'key': 'keystone', 'value': {'container_name': 'keystone', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone:2024.2', 'volumes': ['/etc/kolla/keystone/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/', '', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:5000'], 'timeout': '30'}, 'haproxy': {'keystone_internal': {'enabled': True, 'mode': 'http', 'external': False, 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin']}, 'keystone_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin']}}}}) 2025-09-27 00:54:48.352537 | orchestrator | changed: [testbed-node-0] => (item={'key': 'keystone-ssh', 'value': {'container_name': 'keystone_ssh', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone-ssh:2024.2', 'volumes': ['/etc/kolla/keystone-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 8023'], 'timeout': '30'}}}) 2025-09-27 00:54:48.352548 | orchestrator | changed: [testbed-node-2] => (item={'key': 'keystone-ssh', 'value': {'container_name': 'keystone_ssh', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone-ssh:2024.2', 'volumes': ['/etc/kolla/keystone-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 8023'], 'timeout': '30'}}}) 2025-09-27 00:54:48.352557 | orchestrator | changed: [testbed-node-1] => (item={'key': 'keystone-ssh', 'value': {'container_name': 'keystone_ssh', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone-ssh:2024.2', 'volumes': ['/etc/kolla/keystone-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 8023'], 'timeout': '30'}}}) 2025-09-27 00:54:48.352567 | orchestrator | changed: [testbed-node-0] => (item={'key': 'keystone-fernet', 'value': {'container_name': 'keystone_fernet', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone-fernet:2024.2', 'volumes': 
['/etc/kolla/keystone-fernet/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/fernet-healthcheck.sh'], 'timeout': '30'}}}) 2025-09-27 00:54:48.352576 | orchestrator | changed: [testbed-node-2] => (item={'key': 'keystone-fernet', 'value': {'container_name': 'keystone_fernet', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone-fernet:2024.2', 'volumes': ['/etc/kolla/keystone-fernet/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/fernet-healthcheck.sh'], 'timeout': '30'}}}) 2025-09-27 00:54:48.352594 | orchestrator | changed: [testbed-node-1] => (item={'key': 'keystone-fernet', 'value': {'container_name': 'keystone_fernet', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone-fernet:2024.2', 'volumes': ['/etc/kolla/keystone-fernet/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/fernet-healthcheck.sh'], 'timeout': '30'}}}) 2025-09-27 00:54:48.352604 | orchestrator | 2025-09-27 00:54:48.352613 | orchestrator | TASK [keystone : Check if policies shall be overwritten] *********************** 2025-09-27 00:54:48.352622 | orchestrator | Saturday 27 September 2025 00:54:03 +0000 (0:00:01.742) 0:00:03.124 **** 2025-09-27 00:54:48.352636 | orchestrator | ok: [testbed-node-0 -> localhost] => (item=/opt/configuration/environments/kolla/files/overlays/keystone/policy.yaml) 2025-09-27 00:54:48.352646 | orchestrator | 2025-09-27 00:54:48.352655 | orchestrator | TASK [keystone : Set keystone policy file] ************************************* 2025-09-27 00:54:48.352663 | orchestrator | Saturday 27 September 2025 00:54:03 +0000 (0:00:00.693) 0:00:03.818 **** 2025-09-27 00:54:48.352672 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:54:48.352680 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:54:48.352689 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:54:48.352697 | orchestrator | 2025-09-27 00:54:48.352706 | orchestrator | TASK [keystone : Check if Keystone domain-specific config is supplied] ********* 2025-09-27 00:54:48.352715 | orchestrator | Saturday 27 September 2025 00:54:04 +0000 (0:00:00.372) 0:00:04.190 **** 2025-09-27 00:54:48.352723 | orchestrator | ok: [testbed-node-0 -> localhost] 2025-09-27 00:54:48.352732 | orchestrator | 2025-09-27 00:54:48.352741 | orchestrator | TASK [keystone : include_tasks] ************************************************ 2025-09-27 00:54:48.352750 | orchestrator | Saturday 27 September 2025 00:54:04 +0000 (0:00:00.630) 0:00:04.820 **** 2025-09-27 00:54:48.352758 | orchestrator | included: /ansible/roles/keystone/tasks/copy-certs.yml for testbed-node-0, testbed-node-1, testbed-node-2 2025-09-27 00:54:48.352767 | orchestrator | 2025-09-27 00:54:48.352776 | orchestrator | TASK [service-cert-copy : keystone | Copying over extra CA 
certificates] ******* 2025-09-27 00:54:48.352785 | orchestrator | Saturday 27 September 2025 00:54:05 +0000 (0:00:00.554) 0:00:05.374 **** 2025-09-27 00:54:48.352794 | orchestrator | changed: [testbed-node-2] => (item={'key': 'keystone', 'value': {'container_name': 'keystone', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone:2024.2', 'volumes': ['/etc/kolla/keystone/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/', '', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:5000'], 'timeout': '30'}, 'haproxy': {'keystone_internal': {'enabled': True, 'mode': 'http', 'external': False, 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin']}, 'keystone_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin']}}}}) 2025-09-27 00:54:48.352804 | orchestrator | changed: [testbed-node-1] => (item={'key': 'keystone', 'value': {'container_name': 'keystone', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone:2024.2', 'volumes': ['/etc/kolla/keystone/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/', '', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:5000'], 'timeout': '30'}, 'haproxy': {'keystone_internal': {'enabled': True, 'mode': 'http', 'external': False, 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin']}, 'keystone_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin']}}}}) 2025-09-27 00:54:48.352823 | orchestrator | changed: [testbed-node-0] => (item={'key': 'keystone', 'value': {'container_name': 'keystone', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone:2024.2', 'volumes': ['/etc/kolla/keystone/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/', '', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:5000'], 'timeout': '30'}, 'haproxy': {'keystone_internal': {'enabled': True, 'mode': 'http', 'external': False, 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin']}, 'keystone_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin']}}}}) 2025-09-27 00:54:48.352839 | orchestrator | changed: [testbed-node-2] => (item={'key': 'keystone-ssh', 'value': {'container_name': 'keystone_ssh', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone-ssh:2024.2', 'volumes': 
['/etc/kolla/keystone-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 8023'], 'timeout': '30'}}}) 2025-09-27 00:54:48.352848 | orchestrator | changed: [testbed-node-1] => (item={'key': 'keystone-ssh', 'value': {'container_name': 'keystone_ssh', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone-ssh:2024.2', 'volumes': ['/etc/kolla/keystone-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 8023'], 'timeout': '30'}}}) 2025-09-27 00:54:48.352858 | orchestrator | changed: [testbed-node-0] => (item={'key': 'keystone-ssh', 'value': {'container_name': 'keystone_ssh', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone-ssh:2024.2', 'volumes': ['/etc/kolla/keystone-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 8023'], 'timeout': '30'}}}) 2025-09-27 00:54:48.352867 | orchestrator | changed: [testbed-node-2] => (item={'key': 'keystone-fernet', 'value': {'container_name': 'keystone_fernet', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone-fernet:2024.2', 'volumes': ['/etc/kolla/keystone-fernet/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/fernet-healthcheck.sh'], 'timeout': '30'}}}) 2025-09-27 00:54:48.352884 | orchestrator | changed: [testbed-node-1] => (item={'key': 'keystone-fernet', 'value': {'container_name': 'keystone_fernet', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone-fernet:2024.2', 'volumes': ['/etc/kolla/keystone-fernet/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/fernet-healthcheck.sh'], 'timeout': '30'}}}) 2025-09-27 00:54:48.352897 | orchestrator | changed: [testbed-node-0] => (item={'key': 'keystone-fernet', 'value': {'container_name': 'keystone_fernet', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone-fernet:2024.2', 'volumes': ['/etc/kolla/keystone-fernet/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 
'/usr/bin/fernet-healthcheck.sh'], 'timeout': '30'}}}) 2025-09-27 00:54:48.352906 | orchestrator | 2025-09-27 00:54:48.352915 | orchestrator | TASK [service-cert-copy : keystone | Copying over backend internal TLS certificate] *** 2025-09-27 00:54:48.352923 | orchestrator | Saturday 27 September 2025 00:54:08 +0000 (0:00:03.373) 0:00:08.748 **** 2025-09-27 00:54:48.352939 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'keystone', 'value': {'container_name': 'keystone', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone:2024.2', 'volumes': ['/etc/kolla/keystone/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/', '', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:5000'], 'timeout': '30'}, 'haproxy': {'keystone_internal': {'enabled': True, 'mode': 'http', 'external': False, 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin']}, 'keystone_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin']}}}})  2025-09-27 00:54:48.352949 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'keystone-ssh', 'value': {'container_name': 'keystone_ssh', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone-ssh:2024.2', 'volumes': ['/etc/kolla/keystone-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 8023'], 'timeout': '30'}}})  2025-09-27 00:54:48.352958 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'keystone-fernet', 'value': {'container_name': 'keystone_fernet', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone-fernet:2024.2', 'volumes': ['/etc/kolla/keystone-fernet/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/fernet-healthcheck.sh'], 'timeout': '30'}}})  2025-09-27 00:54:48.352974 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:54:48.352984 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'keystone', 'value': {'container_name': 'keystone', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone:2024.2', 'volumes': ['/etc/kolla/keystone/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/', '', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:5000'], 'timeout': '30'}, 'haproxy': {'keystone_internal': {'enabled': True, 'mode': 'http', 'external': False, 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 
'backend_http_extra': ['balance roundrobin']}, 'keystone_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin']}}}})  2025-09-27 00:54:48.352997 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'keystone-ssh', 'value': {'container_name': 'keystone_ssh', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone-ssh:2024.2', 'volumes': ['/etc/kolla/keystone-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 8023'], 'timeout': '30'}}})  2025-09-27 00:54:48.353013 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'keystone-fernet', 'value': {'container_name': 'keystone_fernet', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone-fernet:2024.2', 'volumes': ['/etc/kolla/keystone-fernet/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/fernet-healthcheck.sh'], 'timeout': '30'}}})  2025-09-27 00:54:48.353023 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:54:48.353034 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'keystone', 'value': {'container_name': 'keystone', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone:2024.2', 'volumes': ['/etc/kolla/keystone/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/', '', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:5000'], 'timeout': '30'}, 'haproxy': {'keystone_internal': {'enabled': True, 'mode': 'http', 'external': False, 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin']}, 'keystone_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin']}}}})  2025-09-27 00:54:48.353045 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'keystone-ssh', 'value': {'container_name': 'keystone_ssh', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone-ssh:2024.2', 'volumes': ['/etc/kolla/keystone-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 8023'], 'timeout': '30'}}})  2025-09-27 00:54:48.353061 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'keystone-fernet', 'value': {'container_name': 'keystone_fernet', 'group': 'keystone', 'enabled': True, 'image': 
'registry.osism.tech/kolla/keystone-fernet:2024.2', 'volumes': ['/etc/kolla/keystone-fernet/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/fernet-healthcheck.sh'], 'timeout': '30'}}})  2025-09-27 00:54:48.353089 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:54:48.353100 | orchestrator | 2025-09-27 00:54:48.353110 | orchestrator | TASK [service-cert-copy : keystone | Copying over backend internal TLS key] **** 2025-09-27 00:54:48.353120 | orchestrator | Saturday 27 September 2025 00:54:09 +0000 (0:00:00.846) 0:00:09.595 **** 2025-09-27 00:54:48.353134 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'keystone', 'value': {'container_name': 'keystone', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone:2024.2', 'volumes': ['/etc/kolla/keystone/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/', '', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:5000'], 'timeout': '30'}, 'haproxy': {'keystone_internal': {'enabled': True, 'mode': 'http', 'external': False, 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin']}, 'keystone_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin']}}}})  2025-09-27 00:54:48.353150 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'keystone-ssh', 'value': {'container_name': 'keystone_ssh', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone-ssh:2024.2', 'volumes': ['/etc/kolla/keystone-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 8023'], 'timeout': '30'}}})  2025-09-27 00:54:48.353162 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'keystone-fernet', 'value': {'container_name': 'keystone_fernet', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone-fernet:2024.2', 'volumes': ['/etc/kolla/keystone-fernet/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/fernet-healthcheck.sh'], 'timeout': '30'}}})  2025-09-27 00:54:48.353171 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:54:48.353182 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'keystone', 'value': {'container_name': 'keystone', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone:2024.2', 'volumes': ['/etc/kolla/keystone/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/', '', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:5000'], 'timeout': '30'}, 'haproxy': {'keystone_internal': {'enabled': True, 'mode': 'http', 'external': False, 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin']}, 'keystone_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin']}}}})  2025-09-27 00:54:48.353200 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'keystone-ssh', 'value': {'container_name': 'keystone_ssh', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone-ssh:2024.2', 'volumes': ['/etc/kolla/keystone-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 8023'], 'timeout': '30'}}})  2025-09-27 00:54:48.353210 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'keystone-fernet', 'value': {'container_name': 'keystone_fernet', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone-fernet:2024.2', 'volumes': ['/etc/kolla/keystone-fernet/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/fernet-healthcheck.sh'], 'timeout': '30'}}})  2025-09-27 00:54:48.353220 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:54:48.353233 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'keystone', 'value': {'container_name': 'keystone', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone:2024.2', 'volumes': ['/etc/kolla/keystone/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/', '', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:5000'], 'timeout': '30'}, 'haproxy': {'keystone_internal': {'enabled': True, 'mode': 'http', 'external': False, 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin']}, 'keystone_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin']}}}})  2025-09-27 00:54:48.353250 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'keystone-ssh', 'value': {'container_name': 'keystone_ssh', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone-ssh:2024.2', 'volumes': ['/etc/kolla/keystone-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 
'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 8023'], 'timeout': '30'}}})  2025-09-27 00:54:48.353261 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'keystone-fernet', 'value': {'container_name': 'keystone_fernet', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone-fernet:2024.2', 'volumes': ['/etc/kolla/keystone-fernet/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/fernet-healthcheck.sh'], 'timeout': '30'}}})  2025-09-27 00:54:48.353271 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:54:48.353286 | orchestrator | 2025-09-27 00:54:48.353296 | orchestrator | TASK [keystone : Copying over config.json files for services] ****************** 2025-09-27 00:54:48.353306 | orchestrator | Saturday 27 September 2025 00:54:10 +0000 (0:00:00.747) 0:00:10.342 **** 2025-09-27 00:54:48.353316 | orchestrator | changed: [testbed-node-0] => (item={'key': 'keystone', 'value': {'container_name': 'keystone', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone:2024.2', 'volumes': ['/etc/kolla/keystone/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/', '', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:5000'], 'timeout': '30'}, 'haproxy': {'keystone_internal': {'enabled': True, 'mode': 'http', 'external': False, 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin']}, 'keystone_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin']}}}}) 2025-09-27 00:54:48.353331 | orchestrator | changed: [testbed-node-1] => (item={'key': 'keystone', 'value': {'container_name': 'keystone', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone:2024.2', 'volumes': ['/etc/kolla/keystone/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/', '', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:5000'], 'timeout': '30'}, 'haproxy': {'keystone_internal': {'enabled': True, 'mode': 'http', 'external': False, 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin']}, 'keystone_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin']}}}}) 2025-09-27 00:54:48.353349 | orchestrator | changed: [testbed-node-2] => (item={'key': 'keystone', 'value': {'container_name': 'keystone', 'group': 'keystone', 'enabled': True, 'image': 
'registry.osism.tech/kolla/keystone:2024.2', 'volumes': ['/etc/kolla/keystone/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/', '', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:5000'], 'timeout': '30'}, 'haproxy': {'keystone_internal': {'enabled': True, 'mode': 'http', 'external': False, 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin']}, 'keystone_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin']}}}}) 2025-09-27 00:54:48.353360 | orchestrator | changed: [testbed-node-0] => (item={'key': 'keystone-ssh', 'value': {'container_name': 'keystone_ssh', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone-ssh:2024.2', 'volumes': ['/etc/kolla/keystone-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 8023'], 'timeout': '30'}}}) 2025-09-27 00:54:48.353371 | orchestrator | changed: [testbed-node-1] => (item={'key': 'keystone-ssh', 'value': {'container_name': 'keystone_ssh', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone-ssh:2024.2', 'volumes': ['/etc/kolla/keystone-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 8023'], 'timeout': '30'}}}) 2025-09-27 00:54:48.353386 | orchestrator | changed: [testbed-node-2] => (item={'key': 'keystone-ssh', 'value': {'container_name': 'keystone_ssh', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone-ssh:2024.2', 'volumes': ['/etc/kolla/keystone-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 8023'], 'timeout': '30'}}}) 2025-09-27 00:54:48.353395 | orchestrator | changed: [testbed-node-0] => (item={'key': 'keystone-fernet', 'value': {'container_name': 'keystone_fernet', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone-fernet:2024.2', 'volumes': ['/etc/kolla/keystone-fernet/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/fernet-healthcheck.sh'], 'timeout': '30'}}}) 2025-09-27 00:54:48.353407 | orchestrator | changed: [testbed-node-1] => (item={'key': 'keystone-fernet', 'value': {'container_name': 
'keystone_fernet', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone-fernet:2024.2', 'volumes': ['/etc/kolla/keystone-fernet/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/fernet-healthcheck.sh'], 'timeout': '30'}}}) 2025-09-27 00:54:48.353417 | orchestrator | changed: [testbed-node-2] => (item={'key': 'keystone-fernet', 'value': {'container_name': 'keystone_fernet', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone-fernet:2024.2', 'volumes': ['/etc/kolla/keystone-fernet/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/fernet-healthcheck.sh'], 'timeout': '30'}}}) 2025-09-27 00:54:48.353426 | orchestrator | 2025-09-27 00:54:48.353435 | orchestrator | TASK [keystone : Copying over keystone.conf] *********************************** 2025-09-27 00:54:48.353443 | orchestrator | Saturday 27 September 2025 00:54:13 +0000 (0:00:03.386) 0:00:13.728 **** 2025-09-27 00:54:48.353459 | orchestrator | changed: [testbed-node-0] => (item={'key': 'keystone', 'value': {'container_name': 'keystone', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone:2024.2', 'volumes': ['/etc/kolla/keystone/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/', '', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:5000'], 'timeout': '30'}, 'haproxy': {'keystone_internal': {'enabled': True, 'mode': 'http', 'external': False, 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin']}, 'keystone_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin']}}}}) 2025-09-27 00:54:48.353474 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'keystone-ssh', 'value': {'container_name': 'keystone_ssh', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone-ssh:2024.2', 'volumes': ['/etc/kolla/keystone-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 8023'], 'timeout': '30'}}})  2025-09-27 00:54:48.353483 | orchestrator | changed: [testbed-node-1] => (item={'key': 'keystone', 'value': {'container_name': 'keystone', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone:2024.2', 'volumes': ['/etc/kolla/keystone/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/', '', 
'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:5000'], 'timeout': '30'}, 'haproxy': {'keystone_internal': {'enabled': True, 'mode': 'http', 'external': False, 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin']}, 'keystone_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin']}}}}) 2025-09-27 00:54:48.353493 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'keystone-ssh', 'value': {'container_name': 'keystone_ssh', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone-ssh:2024.2', 'volumes': ['/etc/kolla/keystone-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 8023'], 'timeout': '30'}}})  2025-09-27 00:54:48.353510 | orchestrator | changed: [testbed-node-2] => (item={'key': 'keystone', 'value': {'container_name': 'keystone', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone:2024.2', 'volumes': ['/etc/kolla/keystone/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/', '', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:5000'], 'timeout': '30'}, 'haproxy': {'keystone_internal': {'enabled': True, 'mode': 'http', 'external': False, 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin']}, 'keystone_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin']}}}}) 2025-09-27 00:54:48.353520 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'keystone-ssh', 'value': {'container_name': 'keystone_ssh', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone-ssh:2024.2', 'volumes': ['/etc/kolla/keystone-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 8023'], 'timeout': '30'}}})  2025-09-27 00:54:48.353533 | orchestrator | changed: [testbed-node-0] => (item={'key': 'keystone-fernet', 'value': {'container_name': 'keystone_fernet', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone-fernet:2024.2', 'volumes': ['/etc/kolla/keystone-fernet/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': 
['CMD-SHELL', '/usr/bin/fernet-healthcheck.sh'], 'timeout': '30'}}}) 2025-09-27 00:54:48.353543 | orchestrator | changed: [testbed-node-1] => (item={'key': 'keystone-fernet', 'value': {'container_name': 'keystone_fernet', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone-fernet:2024.2', 'volumes': ['/etc/kolla/keystone-fernet/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/fernet-healthcheck.sh'], 'timeout': '30'}}}) 2025-09-27 00:54:48.353552 | orchestrator | changed: [testbed-node-2] => (item={'key': 'keystone-fernet', 'value': {'container_name': 'keystone_fernet', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone-fernet:2024.2', 'volumes': ['/etc/kolla/keystone-fernet/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/fernet-healthcheck.sh'], 'timeout': '30'}}}) 2025-09-27 00:54:48.353560 | orchestrator | 2025-09-27 00:54:48.353569 | orchestrator | TASK [keystone : Copying keystone-startup script for keystone] ***************** 2025-09-27 00:54:48.353578 | orchestrator | Saturday 27 September 2025 00:54:19 +0000 (0:00:06.079) 0:00:19.807 **** 2025-09-27 00:54:48.353587 | orchestrator | changed: [testbed-node-0] 2025-09-27 00:54:48.353596 | orchestrator | changed: [testbed-node-1] 2025-09-27 00:54:48.353604 | orchestrator | changed: [testbed-node-2] 2025-09-27 00:54:48.353613 | orchestrator | 2025-09-27 00:54:48.353622 | orchestrator | TASK [keystone : Create Keystone domain-specific config directory] ************* 2025-09-27 00:54:48.353630 | orchestrator | Saturday 27 September 2025 00:54:21 +0000 (0:00:01.500) 0:00:21.308 **** 2025-09-27 00:54:48.353639 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:54:48.353647 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:54:48.353656 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:54:48.353664 | orchestrator | 2025-09-27 00:54:48.353673 | orchestrator | TASK [keystone : Get file list in custom domains folder] *********************** 2025-09-27 00:54:48.353687 | orchestrator | Saturday 27 September 2025 00:54:21 +0000 (0:00:00.650) 0:00:21.959 **** 2025-09-27 00:54:48.353696 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:54:48.353705 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:54:48.353713 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:54:48.353722 | orchestrator | 2025-09-27 00:54:48.353730 | orchestrator | TASK [keystone : Copying Keystone Domain specific settings] ******************** 2025-09-27 00:54:48.353739 | orchestrator | Saturday 27 September 2025 00:54:22 +0000 (0:00:00.339) 0:00:22.298 **** 2025-09-27 00:54:48.353747 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:54:48.353756 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:54:48.353764 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:54:48.353773 | orchestrator | 2025-09-27 00:54:48.353781 | orchestrator | TASK [keystone : Copying over existing policy file] **************************** 2025-09-27 00:54:48.353795 | 
orchestrator | Saturday 27 September 2025 00:54:22 +0000 (0:00:00.539) 0:00:22.837 **** 2025-09-27 00:54:48.353811 | orchestrator | changed: [testbed-node-1] => (item={'key': 'keystone', 'value': {'container_name': 'keystone', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone:2024.2', 'volumes': ['/etc/kolla/keystone/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/', '', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:5000'], 'timeout': '30'}, 'haproxy': {'keystone_internal': {'enabled': True, 'mode': 'http', 'external': False, 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin']}, 'keystone_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin']}}}}) 2025-09-27 00:54:48.353821 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'keystone-ssh', 'value': {'container_name': 'keystone_ssh', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone-ssh:2024.2', 'volumes': ['/etc/kolla/keystone-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 8023'], 'timeout': '30'}}})  2025-09-27 00:54:48.353831 | orchestrator | changed: [testbed-node-0] => (item={'key': 'keystone', 'value': {'container_name': 'keystone', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone:2024.2', 'volumes': ['/etc/kolla/keystone/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/', '', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:5000'], 'timeout': '30'}, 'haproxy': {'keystone_internal': {'enabled': True, 'mode': 'http', 'external': False, 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin']}, 'keystone_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin']}}}}) 2025-09-27 00:54:48.353841 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'keystone-ssh', 'value': {'container_name': 'keystone_ssh', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone-ssh:2024.2', 'volumes': ['/etc/kolla/keystone-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 8023'], 'timeout': '30'}}})  2025-09-27 00:54:48.353853 | orchestrator | changed: [testbed-node-2] => (item={'key': 
'keystone', 'value': {'container_name': 'keystone', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone:2024.2', 'volumes': ['/etc/kolla/keystone/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/', '', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:5000'], 'timeout': '30'}, 'haproxy': {'keystone_internal': {'enabled': True, 'mode': 'http', 'external': False, 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin']}, 'keystone_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin']}}}}) 2025-09-27 00:54:48.353872 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'keystone-ssh', 'value': {'container_name': 'keystone_ssh', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone-ssh:2024.2', 'volumes': ['/etc/kolla/keystone-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 8023'], 'timeout': '30'}}})  2025-09-27 00:54:48.353882 | orchestrator | changed: [testbed-node-1] => (item={'key': 'keystone-fernet', 'value': {'container_name': 'keystone_fernet', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone-fernet:2024.2', 'volumes': ['/etc/kolla/keystone-fernet/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/fernet-healthcheck.sh'], 'timeout': '30'}}}) 2025-09-27 00:54:48.353891 | orchestrator | changed: [testbed-node-0] => (item={'key': 'keystone-fernet', 'value': {'container_name': 'keystone_fernet', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone-fernet:2024.2', 'volumes': ['/etc/kolla/keystone-fernet/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/fernet-healthcheck.sh'], 'timeout': '30'}}}) 2025-09-27 00:54:48.353900 | orchestrator | changed: [testbed-node-2] => (item={'key': 'keystone-fernet', 'value': {'container_name': 'keystone_fernet', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone-fernet:2024.2', 'volumes': ['/etc/kolla/keystone-fernet/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/fernet-healthcheck.sh'], 'timeout': '30'}}}) 2025-09-27 
00:54:48.353909 | orchestrator | 2025-09-27 00:54:48.353918 | orchestrator | TASK [keystone : include_tasks] ************************************************ 2025-09-27 00:54:48.353926 | orchestrator | Saturday 27 September 2025 00:54:25 +0000 (0:00:02.494) 0:00:25.332 **** 2025-09-27 00:54:48.353935 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:54:48.353944 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:54:48.353952 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:54:48.353960 | orchestrator | 2025-09-27 00:54:48.353969 | orchestrator | TASK [keystone : Copying over wsgi-keystone.conf] ****************************** 2025-09-27 00:54:48.353977 | orchestrator | Saturday 27 September 2025 00:54:25 +0000 (0:00:00.424) 0:00:25.757 **** 2025-09-27 00:54:48.353986 | orchestrator | changed: [testbed-node-0] => (item=/ansible/roles/keystone/templates/wsgi-keystone.conf.j2) 2025-09-27 00:54:48.353995 | orchestrator | changed: [testbed-node-1] => (item=/ansible/roles/keystone/templates/wsgi-keystone.conf.j2) 2025-09-27 00:54:48.354003 | orchestrator | changed: [testbed-node-2] => (item=/ansible/roles/keystone/templates/wsgi-keystone.conf.j2) 2025-09-27 00:54:48.354126 | orchestrator | 2025-09-27 00:54:48.354141 | orchestrator | TASK [keystone : Checking whether keystone-paste.ini file exists] ************** 2025-09-27 00:54:48.354154 | orchestrator | Saturday 27 September 2025 00:54:27 +0000 (0:00:02.059) 0:00:27.816 **** 2025-09-27 00:54:48.354163 | orchestrator | ok: [testbed-node-0 -> localhost] 2025-09-27 00:54:48.354172 | orchestrator | 2025-09-27 00:54:48.354181 | orchestrator | TASK [keystone : Copying over keystone-paste.ini] ****************************** 2025-09-27 00:54:48.354189 | orchestrator | Saturday 27 September 2025 00:54:28 +0000 (0:00:00.909) 0:00:28.725 **** 2025-09-27 00:54:48.354198 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:54:48.354207 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:54:48.354215 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:54:48.354224 | orchestrator | 2025-09-27 00:54:48.354232 | orchestrator | TASK [keystone : Generate the required cron jobs for the node] ***************** 2025-09-27 00:54:48.354241 | orchestrator | Saturday 27 September 2025 00:54:29 +0000 (0:00:00.795) 0:00:29.520 **** 2025-09-27 00:54:48.354250 | orchestrator | ok: [testbed-node-1 -> localhost] 2025-09-27 00:54:48.354258 | orchestrator | ok: [testbed-node-0 -> localhost] 2025-09-27 00:54:48.354267 | orchestrator | ok: [testbed-node-2 -> localhost] 2025-09-27 00:54:48.354276 | orchestrator | 2025-09-27 00:54:48.354284 | orchestrator | TASK [keystone : Set fact with the generated cron jobs for building the crontab later] *** 2025-09-27 00:54:48.354293 | orchestrator | Saturday 27 September 2025 00:54:30 +0000 (0:00:01.061) 0:00:30.582 **** 2025-09-27 00:54:48.354308 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:54:48.354316 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:54:48.354325 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:54:48.354334 | orchestrator | 2025-09-27 00:54:48.354342 | orchestrator | TASK [keystone : Copying files for keystone-fernet] **************************** 2025-09-27 00:54:48.354351 | orchestrator | Saturday 27 September 2025 00:54:30 +0000 (0:00:00.304) 0:00:30.886 **** 2025-09-27 00:54:48.354360 | orchestrator | changed: [testbed-node-0] => (item={'src': 'crontab.j2', 'dest': 'crontab'}) 2025-09-27 00:54:48.354368 | orchestrator | changed: [testbed-node-1] => (item={'src': 
'crontab.j2', 'dest': 'crontab'}) 2025-09-27 00:54:48.354377 | orchestrator | changed: [testbed-node-2] => (item={'src': 'crontab.j2', 'dest': 'crontab'}) 2025-09-27 00:54:48.354386 | orchestrator | changed: [testbed-node-0] => (item={'src': 'fernet-rotate.sh.j2', 'dest': 'fernet-rotate.sh'}) 2025-09-27 00:54:48.354394 | orchestrator | changed: [testbed-node-1] => (item={'src': 'fernet-rotate.sh.j2', 'dest': 'fernet-rotate.sh'}) 2025-09-27 00:54:48.354403 | orchestrator | changed: [testbed-node-2] => (item={'src': 'fernet-rotate.sh.j2', 'dest': 'fernet-rotate.sh'}) 2025-09-27 00:54:48.354412 | orchestrator | changed: [testbed-node-0] => (item={'src': 'fernet-node-sync.sh.j2', 'dest': 'fernet-node-sync.sh'}) 2025-09-27 00:54:48.354420 | orchestrator | changed: [testbed-node-1] => (item={'src': 'fernet-node-sync.sh.j2', 'dest': 'fernet-node-sync.sh'}) 2025-09-27 00:54:48.354429 | orchestrator | changed: [testbed-node-2] => (item={'src': 'fernet-node-sync.sh.j2', 'dest': 'fernet-node-sync.sh'}) 2025-09-27 00:54:48.354438 | orchestrator | changed: [testbed-node-0] => (item={'src': 'fernet-push.sh.j2', 'dest': 'fernet-push.sh'}) 2025-09-27 00:54:48.354446 | orchestrator | changed: [testbed-node-1] => (item={'src': 'fernet-push.sh.j2', 'dest': 'fernet-push.sh'}) 2025-09-27 00:54:48.354455 | orchestrator | changed: [testbed-node-2] => (item={'src': 'fernet-push.sh.j2', 'dest': 'fernet-push.sh'}) 2025-09-27 00:54:48.354463 | orchestrator | changed: [testbed-node-0] => (item={'src': 'fernet-healthcheck.sh.j2', 'dest': 'fernet-healthcheck.sh'}) 2025-09-27 00:54:48.354472 | orchestrator | changed: [testbed-node-1] => (item={'src': 'fernet-healthcheck.sh.j2', 'dest': 'fernet-healthcheck.sh'}) 2025-09-27 00:54:48.354481 | orchestrator | changed: [testbed-node-2] => (item={'src': 'fernet-healthcheck.sh.j2', 'dest': 'fernet-healthcheck.sh'}) 2025-09-27 00:54:48.354489 | orchestrator | changed: [testbed-node-0] => (item={'src': 'id_rsa', 'dest': 'id_rsa'}) 2025-09-27 00:54:48.354498 | orchestrator | changed: [testbed-node-1] => (item={'src': 'id_rsa', 'dest': 'id_rsa'}) 2025-09-27 00:54:48.354513 | orchestrator | changed: [testbed-node-2] => (item={'src': 'id_rsa', 'dest': 'id_rsa'}) 2025-09-27 00:54:48.354522 | orchestrator | changed: [testbed-node-0] => (item={'src': 'ssh_config.j2', 'dest': 'ssh_config'}) 2025-09-27 00:54:48.354531 | orchestrator | changed: [testbed-node-1] => (item={'src': 'ssh_config.j2', 'dest': 'ssh_config'}) 2025-09-27 00:54:48.354539 | orchestrator | changed: [testbed-node-2] => (item={'src': 'ssh_config.j2', 'dest': 'ssh_config'}) 2025-09-27 00:54:48.354548 | orchestrator | 2025-09-27 00:54:48.354556 | orchestrator | TASK [keystone : Copying files for keystone-ssh] ******************************* 2025-09-27 00:54:48.354565 | orchestrator | Saturday 27 September 2025 00:54:40 +0000 (0:00:09.136) 0:00:40.022 **** 2025-09-27 00:54:48.354573 | orchestrator | changed: [testbed-node-0] => (item={'src': 'sshd_config.j2', 'dest': 'sshd_config'}) 2025-09-27 00:54:48.354582 | orchestrator | changed: [testbed-node-1] => (item={'src': 'sshd_config.j2', 'dest': 'sshd_config'}) 2025-09-27 00:54:48.354591 | orchestrator | changed: [testbed-node-2] => (item={'src': 'sshd_config.j2', 'dest': 'sshd_config'}) 2025-09-27 00:54:48.354599 | orchestrator | changed: [testbed-node-0] => (item={'src': 'id_rsa.pub', 'dest': 'id_rsa.pub'}) 2025-09-27 00:54:48.354608 | orchestrator | changed: [testbed-node-1] => (item={'src': 'id_rsa.pub', 'dest': 'id_rsa.pub'}) 2025-09-27 00:54:48.354617 | orchestrator | 
changed: [testbed-node-2] => (item={'src': 'id_rsa.pub', 'dest': 'id_rsa.pub'}) 2025-09-27 00:54:48.354625 | orchestrator | 2025-09-27 00:54:48.354634 | orchestrator | TASK [keystone : Check keystone containers] ************************************ 2025-09-27 00:54:48.354643 | orchestrator | Saturday 27 September 2025 00:54:42 +0000 (0:00:02.921) 0:00:42.944 **** 2025-09-27 00:54:48.354660 | orchestrator | changed: [testbed-node-0] => (item={'key': 'keystone', 'value': {'container_name': 'keystone', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone:2024.2', 'volumes': ['/etc/kolla/keystone/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/', '', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.10:5000'], 'timeout': '30'}, 'haproxy': {'keystone_internal': {'enabled': True, 'mode': 'http', 'external': False, 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin']}, 'keystone_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin']}}}}) 2025-09-27 00:54:48.354672 | orchestrator | changed: [testbed-node-1] => (item={'key': 'keystone', 'value': {'container_name': 'keystone', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone:2024.2', 'volumes': ['/etc/kolla/keystone/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/', '', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.11:5000'], 'timeout': '30'}, 'haproxy': {'keystone_internal': {'enabled': True, 'mode': 'http', 'external': False, 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin']}, 'keystone_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin']}}}}) 2025-09-27 00:54:48.354682 | orchestrator | changed: [testbed-node-2] => (item={'key': 'keystone', 'value': {'container_name': 'keystone', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone:2024.2', 'volumes': ['/etc/kolla/keystone/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '', 'kolla_logs:/var/log/kolla/', '', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_curl http://192.168.16.12:5000'], 'timeout': '30'}, 'haproxy': {'keystone_internal': {'enabled': True, 'mode': 'http', 'external': False, 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin']}, 'keystone_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'tls_backend': 'no', 'port': '5000', 'listen_port': '5000', 'backend_http_extra': ['balance roundrobin']}}}}) 2025-09-27 00:54:48.354697 
| orchestrator | changed: [testbed-node-1] => (item={'key': 'keystone-ssh', 'value': {'container_name': 'keystone_ssh', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone-ssh:2024.2', 'volumes': ['/etc/kolla/keystone-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 8023'], 'timeout': '30'}}}) 2025-09-27 00:54:48.354709 | orchestrator | changed: [testbed-node-0] => (item={'key': 'keystone-ssh', 'value': {'container_name': 'keystone_ssh', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone-ssh:2024.2', 'volumes': ['/etc/kolla/keystone-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 8023'], 'timeout': '30'}}}) 2025-09-27 00:54:48.354719 | orchestrator | changed: [testbed-node-2] => (item={'key': 'keystone-ssh', 'value': {'container_name': 'keystone_ssh', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone-ssh:2024.2', 'volumes': ['/etc/kolla/keystone-ssh/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', 'healthcheck_listen sshd 8023'], 'timeout': '30'}}}) 2025-09-27 00:54:48.354733 | orchestrator | changed: [testbed-node-1] => (item={'key': 'keystone-fernet', 'value': {'container_name': 'keystone_fernet', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone-fernet:2024.2', 'volumes': ['/etc/kolla/keystone-fernet/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/fernet-healthcheck.sh'], 'timeout': '30'}}}) 2025-09-27 00:54:48.354743 | orchestrator | changed: [testbed-node-0] => (item={'key': 'keystone-fernet', 'value': {'container_name': 'keystone_fernet', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone-fernet:2024.2', 'volumes': ['/etc/kolla/keystone-fernet/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/fernet-healthcheck.sh'], 'timeout': '30'}}}) 2025-09-27 00:54:48.354752 | orchestrator | changed: [testbed-node-2] => (item={'key': 'keystone-fernet', 'value': {'container_name': 'keystone_fernet', 'group': 'keystone', 'enabled': True, 'image': 'registry.osism.tech/kolla/keystone-fernet:2024.2', 'volumes': ['/etc/kolla/keystone-fernet/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'keystone_fernet_tokens:/etc/keystone/fernet-keys'], 'dimensions': {}, 'healthcheck': {'interval': '30', 'retries': '3', 'start_period': '5', 'test': ['CMD-SHELL', '/usr/bin/fernet-healthcheck.sh'], 'timeout': '30'}}}) 2025-09-27 00:54:48.354765 | orchestrator | 2025-09-27 00:54:48.354774 | orchestrator | TASK [keystone : include_tasks] ************************************************ 2025-09-27 00:54:48.354783 | orchestrator | Saturday 27 September 2025 00:54:45 +0000 (0:00:02.392) 0:00:45.337 **** 2025-09-27 00:54:48.354792 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:54:48.354800 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:54:48.354809 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:54:48.354818 | orchestrator | 2025-09-27 00:54:48.354826 | orchestrator | TASK [keystone : Creating keystone database] *********************************** 2025-09-27 00:54:48.354835 | orchestrator | Saturday 27 September 2025 00:54:45 +0000 (0:00:00.290) 0:00:45.627 **** 2025-09-27 00:54:48.354844 | orchestrator | fatal: [testbed-node-0]: FAILED! => {"changed": false, "msg": "kolla_toolbox container is not running."} 2025-09-27 00:54:48.354852 | orchestrator | 2025-09-27 00:54:48.354861 | orchestrator | PLAY RECAP ********************************************************************* 2025-09-27 00:54:48.354871 | orchestrator | testbed-node-0 : ok=20  changed=10  unreachable=0 failed=1  skipped=8  rescued=0 ignored=0 2025-09-27 00:54:48.354881 | orchestrator | testbed-node-1 : ok=17  changed=10  unreachable=0 failed=0 skipped=8  rescued=0 ignored=0 2025-09-27 00:54:48.354891 | orchestrator | testbed-node-2 : ok=17  changed=10  unreachable=0 failed=0 skipped=8  rescued=0 ignored=0 2025-09-27 00:54:48.354899 | orchestrator | 2025-09-27 00:54:48.354908 | orchestrator | 2025-09-27 00:54:48.354917 | orchestrator | TASKS RECAP ******************************************************************** 2025-09-27 00:54:48.354925 | orchestrator | Saturday 27 September 2025 00:54:46 +0000 (0:00:00.721) 0:00:46.349 **** 2025-09-27 00:54:48.354934 | orchestrator | =============================================================================== 2025-09-27 00:54:48.354943 | orchestrator | keystone : Copying files for keystone-fernet ---------------------------- 9.14s 2025-09-27 00:54:48.354955 | orchestrator | keystone : Copying over keystone.conf ----------------------------------- 6.08s 2025-09-27 00:54:48.354964 | orchestrator | keystone : Copying over config.json files for services ------------------ 3.39s 2025-09-27 00:54:48.354972 | orchestrator | service-cert-copy : keystone | Copying over extra CA certificates ------- 3.37s 2025-09-27 00:54:48.354981 | orchestrator | keystone : Copying files for keystone-ssh ------------------------------- 2.92s 2025-09-27 00:54:48.354990 | orchestrator | keystone : Copying over existing policy file ---------------------------- 2.49s 2025-09-27 00:54:48.354998 | orchestrator | keystone : Check keystone containers ------------------------------------ 2.39s 2025-09-27 00:54:48.355007 | orchestrator | keystone : Copying over wsgi-keystone.conf ------------------------------ 2.06s 2025-09-27 00:54:48.355016 | orchestrator | keystone : Ensuring config directories exist ---------------------------- 1.74s 2025-09-27 00:54:48.355024 | orchestrator | keystone : Copying keystone-startup script for keystone ----------------- 1.50s 2025-09-27 00:54:48.355033 | orchestrator | keystone : Generate the 
required cron jobs for the node ----------------- 1.06s 2025-09-27 00:54:48.355041 | orchestrator | keystone : Checking whether keystone-paste.ini file exists -------------- 0.91s 2025-09-27 00:54:48.355054 | orchestrator | service-cert-copy : keystone | Copying over backend internal TLS certificate --- 0.85s 2025-09-27 00:54:48.355063 | orchestrator | keystone : Copying over keystone-paste.ini ------------------------------ 0.80s 2025-09-27 00:54:48.355088 | orchestrator | service-cert-copy : keystone | Copying over backend internal TLS key ---- 0.75s 2025-09-27 00:54:48.355102 | orchestrator | keystone : Creating keystone database ----------------------------------- 0.72s 2025-09-27 00:54:48.355111 | orchestrator | keystone : Check if policies shall be overwritten ----------------------- 0.69s 2025-09-27 00:54:48.355120 | orchestrator | keystone : Create Keystone domain-specific config directory ------------- 0.65s 2025-09-27 00:54:48.355128 | orchestrator | keystone : Check if Keystone domain-specific config is supplied --------- 0.63s 2025-09-27 00:54:48.355137 | orchestrator | keystone : include_tasks ------------------------------------------------ 0.55s 2025-09-27 00:54:48.355145 | orchestrator | 2025-09-27 00:54:48 | INFO  | Wait 1 second(s) until the next check 2025-09-27 00:54:51.377918 | orchestrator | 2025-09-27 00:54:51 | INFO  | Task f30d0322-1fdf-44d4-9ab0-6a026f50db97 is in state STARTED 2025-09-27 00:54:51.378014 | orchestrator | 2025-09-27 00:54:51 | INFO  | Task e53b93e4-c64c-4396-8a50-60d422b94ec3 is in state STARTED 2025-09-27 00:54:51.378282 | orchestrator | 2025-09-27 00:54:51 | INFO  | Task d1ceef8b-3048-43ad-8f4d-705b5e676fb0 is in state STARTED 2025-09-27 00:54:51.378959 | orchestrator | 2025-09-27 00:54:51 | INFO  | Task cd555ed5-77dc-4122-95c3-86725144bacd is in state STARTED 2025-09-27 00:54:51.379373 | orchestrator | 2025-09-27 00:54:51 | INFO  | Task c9184fac-fcbd-41ba-a9c1-0709e4e59102 is in state STARTED 2025-09-27 00:54:51.379397 | orchestrator | 2025-09-27 00:54:51 | INFO  | Wait 1 second(s) until the next check 2025-09-27 00:54:54.410406 | orchestrator | 2025-09-27 00:54:54 | INFO  | Task f30d0322-1fdf-44d4-9ab0-6a026f50db97 is in state STARTED 2025-09-27 00:54:54.411952 | orchestrator | 2025-09-27 00:54:54 | INFO  | Task e53b93e4-c64c-4396-8a50-60d422b94ec3 is in state STARTED 2025-09-27 00:54:54.413018 | orchestrator | 2025-09-27 00:54:54 | INFO  | Task d1ceef8b-3048-43ad-8f4d-705b5e676fb0 is in state STARTED 2025-09-27 00:54:54.414199 | orchestrator | 2025-09-27 00:54:54 | INFO  | Task cd555ed5-77dc-4122-95c3-86725144bacd is in state STARTED 2025-09-27 00:54:54.415729 | orchestrator | 2025-09-27 00:54:54 | INFO  | Task c9184fac-fcbd-41ba-a9c1-0709e4e59102 is in state STARTED 2025-09-27 00:54:54.415751 | orchestrator | 2025-09-27 00:54:54 | INFO  | Wait 1 second(s) until the next check 2025-09-27 00:54:57.456475 | orchestrator | 2025-09-27 00:54:57 | INFO  | Task f30d0322-1fdf-44d4-9ab0-6a026f50db97 is in state STARTED 2025-09-27 00:54:57.458457 | orchestrator | 2025-09-27 00:54:57 | INFO  | Task e53b93e4-c64c-4396-8a50-60d422b94ec3 is in state STARTED 2025-09-27 00:54:57.459838 | orchestrator | 2025-09-27 00:54:57 | INFO  | Task d1ceef8b-3048-43ad-8f4d-705b5e676fb0 is in state STARTED 2025-09-27 00:54:57.461315 | orchestrator | 2025-09-27 00:54:57 | INFO  | Task cd555ed5-77dc-4122-95c3-86725144bacd is in state STARTED 2025-09-27 00:54:57.462767 | orchestrator | 2025-09-27 00:54:57 | INFO  | Task c9184fac-fcbd-41ba-a9c1-0709e4e59102 is in state STARTED 
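The keystone play above fails at "Creating keystone database" because kolla-ansible performs database, user and endpoint operations through the kolla_toolbox utility container on the target node, and that container is reported as not running on testbed-node-0. A quick way to confirm its state on the control nodes is a plain docker query per host. The following is a minimal diagnostic sketch under stated assumptions (node names taken from this log, SSH access and the docker CLI on each node); it is not part of the testbed tooling:

# check_kolla_toolbox.py -- minimal diagnostic sketch, not part of the testbed tooling.
# Assumes SSH access to the nodes and the docker CLI on each of them.
import subprocess

NODES = ["testbed-node-0", "testbed-node-1", "testbed-node-2"]  # from this log

def toolbox_running(node: str) -> bool:
    # `docker ps -q --filter name=kolla_toolbox` prints a container ID only if a
    # container with that name is currently running.
    result = subprocess.run(
        ["ssh", node, "docker", "ps", "-q", "--filter", "name=kolla_toolbox"],
        capture_output=True, text=True, check=False,
    )
    return bool(result.stdout.strip())

if __name__ == "__main__":
    for node in NODES:
        state = "running" if toolbox_running(node) else "NOT running"
        print(f"{node}: kolla_toolbox is {state}")

Every later play that delegates work to that container (designate, barbican and neutron below) fails with the same message, so this one missing container explains all of the failures that follow.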
2025-09-27 00:54:57.462791 | orchestrator | 2025-09-27 00:54:57 | INFO  | Wait 1 second(s) until the next check 2025-09-27 00:55:00.513731 | orchestrator | 2025-09-27 00:55:00 | INFO  | Task f30d0322-1fdf-44d4-9ab0-6a026f50db97 is in state STARTED 2025-09-27 00:55:00.516675 | orchestrator | 2025-09-27 00:55:00 | INFO  | Task e53b93e4-c64c-4396-8a50-60d422b94ec3 is in state STARTED 2025-09-27 00:55:00.518822 | orchestrator | 2025-09-27 00:55:00 | INFO  | Task d1ceef8b-3048-43ad-8f4d-705b5e676fb0 is in state STARTED 2025-09-27 00:55:00.521109 | orchestrator | 2025-09-27 00:55:00 | INFO  | Task cd555ed5-77dc-4122-95c3-86725144bacd is in state STARTED 2025-09-27 00:55:00.523505 | orchestrator | 2025-09-27 00:55:00 | INFO  | Task c9184fac-fcbd-41ba-a9c1-0709e4e59102 is in state STARTED 2025-09-27 00:55:00.523942 | orchestrator | 2025-09-27 00:55:00 | INFO  | Wait 1 second(s) until the next check 2025-09-27 00:55:03.574363 | orchestrator | 2025-09-27 00:55:03 | INFO  | Task f30d0322-1fdf-44d4-9ab0-6a026f50db97 is in state STARTED 2025-09-27 00:55:03.576385 | orchestrator | 2025-09-27 00:55:03 | INFO  | Task e53b93e4-c64c-4396-8a50-60d422b94ec3 is in state STARTED 2025-09-27 00:55:03.578162 | orchestrator | 2025-09-27 00:55:03 | INFO  | Task d1ceef8b-3048-43ad-8f4d-705b5e676fb0 is in state STARTED 2025-09-27 00:55:03.579959 | orchestrator | 2025-09-27 00:55:03 | INFO  | Task cd555ed5-77dc-4122-95c3-86725144bacd is in state STARTED 2025-09-27 00:55:03.582272 | orchestrator | 2025-09-27 00:55:03 | INFO  | Task c9184fac-fcbd-41ba-a9c1-0709e4e59102 is in state STARTED 2025-09-27 00:55:03.582312 | orchestrator | 2025-09-27 00:55:03 | INFO  | Wait 1 second(s) until the next check 2025-09-27 00:55:06.634433 | orchestrator | 2025-09-27 00:55:06 | INFO  | Task f30d0322-1fdf-44d4-9ab0-6a026f50db97 is in state STARTED 2025-09-27 00:55:06.637202 | orchestrator | 2025-09-27 00:55:06 | INFO  | Task e53b93e4-c64c-4396-8a50-60d422b94ec3 is in state STARTED 2025-09-27 00:55:06.639638 | orchestrator | 2025-09-27 00:55:06 | INFO  | Task d1ceef8b-3048-43ad-8f4d-705b5e676fb0 is in state STARTED 2025-09-27 00:55:06.642243 | orchestrator | 2025-09-27 00:55:06 | INFO  | Task cd555ed5-77dc-4122-95c3-86725144bacd is in state STARTED 2025-09-27 00:55:06.644161 | orchestrator | 2025-09-27 00:55:06 | INFO  | Task c9184fac-fcbd-41ba-a9c1-0709e4e59102 is in state STARTED 2025-09-27 00:55:06.644186 | orchestrator | 2025-09-27 00:55:06 | INFO  | Wait 1 second(s) until the next check 2025-09-27 00:55:09.696401 | orchestrator | 2025-09-27 00:55:09 | INFO  | Task f30d0322-1fdf-44d4-9ab0-6a026f50db97 is in state STARTED 2025-09-27 00:55:09.698706 | orchestrator | 2025-09-27 00:55:09 | INFO  | Task e53b93e4-c64c-4396-8a50-60d422b94ec3 is in state STARTED 2025-09-27 00:55:09.700927 | orchestrator | 2025-09-27 00:55:09 | INFO  | Task d1ceef8b-3048-43ad-8f4d-705b5e676fb0 is in state STARTED 2025-09-27 00:55:09.702904 | orchestrator | 2025-09-27 00:55:09 | INFO  | Task cd555ed5-77dc-4122-95c3-86725144bacd is in state STARTED 2025-09-27 00:55:09.704434 | orchestrator | 2025-09-27 00:55:09 | INFO  | Task c9184fac-fcbd-41ba-a9c1-0709e4e59102 is in state STARTED 2025-09-27 00:55:09.704745 | orchestrator | 2025-09-27 00:55:09 | INFO  | Wait 1 second(s) until the next check 2025-09-27 00:55:12.756698 | orchestrator | 2025-09-27 00:55:12 | INFO  | Task f30d0322-1fdf-44d4-9ab0-6a026f50db97 is in state STARTED 2025-09-27 00:55:12.758653 | orchestrator | 2025-09-27 00:55:12 | INFO  | Task e53b93e4-c64c-4396-8a50-60d422b94ec3 is in state STARTED 
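The long runs of "Task <uuid> is in state STARTED" followed by "Wait 1 second(s) until the next check" are produced by a watcher that polls the state of each submitted deployment task and sleeps between rounds until every task reports a terminal state. The loop below is an illustrative sketch of that pattern only; get_task_state is a hypothetical placeholder for whatever backend query the real tooling performs, and the one-second interval mirrors the log:

# task_watch.py -- illustrative sketch of the polling pattern seen in this log.
# get_task_state() is a hypothetical stand-in; the real tool queries its own
# task backend (the log shows Celery-style task IDs and states).
import time
from typing import Callable

def wait_for_tasks(task_ids: list[str],
                   get_task_state: Callable[[str], str],
                   interval: float = 1.0) -> dict[str, str]:
    pending = set(task_ids)
    finished: dict[str, str] = {}
    while pending:
        for task_id in sorted(pending):
            state = get_task_state(task_id)
            print(f"Task {task_id} is in state {state}")
            if state in ("SUCCESS", "FAILURE"):
                finished[task_id] = state
        pending -= set(finished)
        if pending:
            print(f"Wait {int(interval)} second(s) until the next check")
            time.sleep(interval)
    return finished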
2025-09-27 00:55:12.761031 | orchestrator | 2025-09-27 00:55:12 | INFO  | Task d1ceef8b-3048-43ad-8f4d-705b5e676fb0 is in state STARTED 2025-09-27 00:55:12.763598 | orchestrator | 2025-09-27 00:55:12 | INFO  | Task cd555ed5-77dc-4122-95c3-86725144bacd is in state STARTED 2025-09-27 00:55:12.765152 | orchestrator | 2025-09-27 00:55:12 | INFO  | Task c9184fac-fcbd-41ba-a9c1-0709e4e59102 is in state STARTED 2025-09-27 00:55:12.765175 | orchestrator | 2025-09-27 00:55:12 | INFO  | Wait 1 second(s) until the next check 2025-09-27 00:55:15.815707 | orchestrator | 2025-09-27 00:55:15 | INFO  | Task f30d0322-1fdf-44d4-9ab0-6a026f50db97 is in state STARTED 2025-09-27 00:55:15.818230 | orchestrator | 2025-09-27 00:55:15 | INFO  | Task e53b93e4-c64c-4396-8a50-60d422b94ec3 is in state STARTED 2025-09-27 00:55:15.820428 | orchestrator | 2025-09-27 00:55:15 | INFO  | Task d1ceef8b-3048-43ad-8f4d-705b5e676fb0 is in state STARTED 2025-09-27 00:55:15.823341 | orchestrator | 2025-09-27 00:55:15 | INFO  | Task cd555ed5-77dc-4122-95c3-86725144bacd is in state STARTED 2025-09-27 00:55:15.824922 | orchestrator | 2025-09-27 00:55:15 | INFO  | Task c9184fac-fcbd-41ba-a9c1-0709e4e59102 is in state STARTED 2025-09-27 00:55:15.825333 | orchestrator | 2025-09-27 00:55:15 | INFO  | Wait 1 second(s) until the next check 2025-09-27 00:55:18.872021 | orchestrator | 2025-09-27 00:55:18 | INFO  | Task f30d0322-1fdf-44d4-9ab0-6a026f50db97 is in state STARTED 2025-09-27 00:55:18.873726 | orchestrator | 2025-09-27 00:55:18 | INFO  | Task e53b93e4-c64c-4396-8a50-60d422b94ec3 is in state STARTED 2025-09-27 00:55:18.876368 | orchestrator | 2025-09-27 00:55:18 | INFO  | Task d1ceef8b-3048-43ad-8f4d-705b5e676fb0 is in state STARTED 2025-09-27 00:55:18.878264 | orchestrator | 2025-09-27 00:55:18 | INFO  | Task cd555ed5-77dc-4122-95c3-86725144bacd is in state STARTED 2025-09-27 00:55:18.879524 | orchestrator | 2025-09-27 00:55:18 | INFO  | Task c9184fac-fcbd-41ba-a9c1-0709e4e59102 is in state STARTED 2025-09-27 00:55:18.879545 | orchestrator | 2025-09-27 00:55:18 | INFO  | Wait 1 second(s) until the next check 2025-09-27 00:55:21.936538 | orchestrator | 2025-09-27 00:55:21 | INFO  | Task f30d0322-1fdf-44d4-9ab0-6a026f50db97 is in state STARTED 2025-09-27 00:55:21.939416 | orchestrator | 2025-09-27 00:55:21 | INFO  | Task e53b93e4-c64c-4396-8a50-60d422b94ec3 is in state STARTED 2025-09-27 00:55:21.942206 | orchestrator | 2025-09-27 00:55:21 | INFO  | Task d1ceef8b-3048-43ad-8f4d-705b5e676fb0 is in state STARTED 2025-09-27 00:55:21.943921 | orchestrator | 2025-09-27 00:55:21 | INFO  | Task cd555ed5-77dc-4122-95c3-86725144bacd is in state STARTED 2025-09-27 00:55:21.947156 | orchestrator | 2025-09-27 00:55:21 | INFO  | Task c9184fac-fcbd-41ba-a9c1-0709e4e59102 is in state STARTED 2025-09-27 00:55:21.947182 | orchestrator | 2025-09-27 00:55:21 | INFO  | Wait 1 second(s) until the next check 2025-09-27 00:55:24.991854 | orchestrator | 2025-09-27 00:55:24 | INFO  | Task f30d0322-1fdf-44d4-9ab0-6a026f50db97 is in state STARTED 2025-09-27 00:55:24.993382 | orchestrator | 2025-09-27 00:55:24 | INFO  | Task e53b93e4-c64c-4396-8a50-60d422b94ec3 is in state STARTED 2025-09-27 00:55:24.994799 | orchestrator | 2025-09-27 00:55:24 | INFO  | Task d1ceef8b-3048-43ad-8f4d-705b5e676fb0 is in state STARTED 2025-09-27 00:55:24.995995 | orchestrator | 2025-09-27 00:55:24 | INFO  | Task cd555ed5-77dc-4122-95c3-86725144bacd is in state STARTED 2025-09-27 00:55:24.998134 | orchestrator | 2025-09-27 00:55:24 | INFO  | Task 
c9184fac-fcbd-41ba-a9c1-0709e4e59102 is in state STARTED 2025-09-27 00:55:24.998419 | orchestrator | 2025-09-27 00:55:24 | INFO  | Wait 1 second(s) until the next check 2025-09-27 00:55:28.050964 | orchestrator | 2025-09-27 00:55:28 | INFO  | Task f30d0322-1fdf-44d4-9ab0-6a026f50db97 is in state STARTED 2025-09-27 00:55:28.053256 | orchestrator | 2025-09-27 00:55:28 | INFO  | Task e53b93e4-c64c-4396-8a50-60d422b94ec3 is in state STARTED 2025-09-27 00:55:28.055051 | orchestrator | 2025-09-27 00:55:28 | INFO  | Task d1ceef8b-3048-43ad-8f4d-705b5e676fb0 is in state STARTED 2025-09-27 00:55:28.057355 | orchestrator | 2025-09-27 00:55:28 | INFO  | Task cd555ed5-77dc-4122-95c3-86725144bacd is in state STARTED 2025-09-27 00:55:28.058295 | orchestrator | 2025-09-27 00:55:28 | INFO  | Task c9184fac-fcbd-41ba-a9c1-0709e4e59102 is in state STARTED 2025-09-27 00:55:28.058510 | orchestrator | 2025-09-27 00:55:28 | INFO  | Wait 1 second(s) until the next check 2025-09-27 00:55:31.106007 | orchestrator | 2025-09-27 00:55:31 | INFO  | Task f30d0322-1fdf-44d4-9ab0-6a026f50db97 is in state STARTED 2025-09-27 00:55:31.108127 | orchestrator | 2025-09-27 00:55:31 | INFO  | Task e53b93e4-c64c-4396-8a50-60d422b94ec3 is in state STARTED 2025-09-27 00:55:31.111317 | orchestrator | 2025-09-27 00:55:31 | INFO  | Task d1ceef8b-3048-43ad-8f4d-705b5e676fb0 is in state STARTED 2025-09-27 00:55:31.112650 | orchestrator | 2025-09-27 00:55:31 | INFO  | Task cd555ed5-77dc-4122-95c3-86725144bacd is in state STARTED 2025-09-27 00:55:31.114172 | orchestrator | 2025-09-27 00:55:31 | INFO  | Task c9184fac-fcbd-41ba-a9c1-0709e4e59102 is in state STARTED 2025-09-27 00:55:31.114195 | orchestrator | 2025-09-27 00:55:31 | INFO  | Wait 1 second(s) until the next check 2025-09-27 00:55:34.164446 | orchestrator | 2025-09-27 00:55:34 | INFO  | Task f30d0322-1fdf-44d4-9ab0-6a026f50db97 is in state STARTED 2025-09-27 00:55:34.166157 | orchestrator | 2025-09-27 00:55:34 | INFO  | Task e53b93e4-c64c-4396-8a50-60d422b94ec3 is in state STARTED 2025-09-27 00:55:34.167942 | orchestrator | 2025-09-27 00:55:34 | INFO  | Task d1ceef8b-3048-43ad-8f4d-705b5e676fb0 is in state STARTED 2025-09-27 00:55:34.172040 | orchestrator | 2025-09-27 00:55:34 | INFO  | Task cd555ed5-77dc-4122-95c3-86725144bacd is in state STARTED 2025-09-27 00:55:34.174766 | orchestrator | 2025-09-27 00:55:34 | INFO  | Task c9184fac-fcbd-41ba-a9c1-0709e4e59102 is in state STARTED 2025-09-27 00:55:34.174792 | orchestrator | 2025-09-27 00:55:34 | INFO  | Wait 1 second(s) until the next check 2025-09-27 00:55:37.217535 | orchestrator | 2025-09-27 00:55:37 | INFO  | Task f30d0322-1fdf-44d4-9ab0-6a026f50db97 is in state STARTED 2025-09-27 00:55:37.218472 | orchestrator | 2025-09-27 00:55:37 | INFO  | Task e53b93e4-c64c-4396-8a50-60d422b94ec3 is in state STARTED 2025-09-27 00:55:37.219538 | orchestrator | 2025-09-27 00:55:37 | INFO  | Task d1ceef8b-3048-43ad-8f4d-705b5e676fb0 is in state STARTED 2025-09-27 00:55:37.220748 | orchestrator | 2025-09-27 00:55:37 | INFO  | Task cd555ed5-77dc-4122-95c3-86725144bacd is in state STARTED 2025-09-27 00:55:37.221727 | orchestrator | 2025-09-27 00:55:37 | INFO  | Task c9184fac-fcbd-41ba-a9c1-0709e4e59102 is in state STARTED 2025-09-27 00:55:37.221954 | orchestrator | 2025-09-27 00:55:37 | INFO  | Wait 1 second(s) until the next check 2025-09-27 00:55:40.275656 | orchestrator | 2025-09-27 00:55:40 | INFO  | Task f30d0322-1fdf-44d4-9ab0-6a026f50db97 is in state STARTED 2025-09-27 00:55:40.280613 | orchestrator | 2025-09-27 00:55:40 | INFO  | Task 
e53b93e4-c64c-4396-8a50-60d422b94ec3 is in state STARTED 2025-09-27 00:55:40.284022 | orchestrator | 2025-09-27 00:55:40 | INFO  | Task d1ceef8b-3048-43ad-8f4d-705b5e676fb0 is in state STARTED 2025-09-27 00:55:40.285441 | orchestrator | 2025-09-27 00:55:40 | INFO  | Task cd555ed5-77dc-4122-95c3-86725144bacd is in state STARTED 2025-09-27 00:55:40.286401 | orchestrator | 2025-09-27 00:55:40 | INFO  | Task c9184fac-fcbd-41ba-a9c1-0709e4e59102 is in state STARTED 2025-09-27 00:55:40.286429 | orchestrator | 2025-09-27 00:55:40 | INFO  | Wait 1 second(s) until the next check 2025-09-27 00:55:43.335976 | orchestrator | 2025-09-27 00:55:43 | INFO  | Task f30d0322-1fdf-44d4-9ab0-6a026f50db97 is in state STARTED 2025-09-27 00:55:43.337522 | orchestrator | 2025-09-27 00:55:43 | INFO  | Task e53b93e4-c64c-4396-8a50-60d422b94ec3 is in state STARTED 2025-09-27 00:55:43.341144 | orchestrator | 2025-09-27 00:55:43 | INFO  | Task d1ceef8b-3048-43ad-8f4d-705b5e676fb0 is in state STARTED 2025-09-27 00:55:43.345643 | orchestrator | 2025-09-27 00:55:43 | INFO  | Task cd555ed5-77dc-4122-95c3-86725144bacd is in state STARTED 2025-09-27 00:55:43.347927 | orchestrator | 2025-09-27 00:55:43 | INFO  | Task c9184fac-fcbd-41ba-a9c1-0709e4e59102 is in state STARTED 2025-09-27 00:55:43.347952 | orchestrator | 2025-09-27 00:55:43 | INFO  | Wait 1 second(s) until the next check 2025-09-27 00:55:46.396530 | orchestrator | 2025-09-27 00:55:46 | INFO  | Task f30d0322-1fdf-44d4-9ab0-6a026f50db97 is in state SUCCESS 2025-09-27 00:55:46.398525 | orchestrator | 2025-09-27 00:55:46 | INFO  | Task e53b93e4-c64c-4396-8a50-60d422b94ec3 is in state STARTED 2025-09-27 00:55:46.401602 | orchestrator | 2025-09-27 00:55:46 | INFO  | Task d1ceef8b-3048-43ad-8f4d-705b5e676fb0 is in state STARTED 2025-09-27 00:55:46.405433 | orchestrator | 2025-09-27 00:55:46 | INFO  | Task cd555ed5-77dc-4122-95c3-86725144bacd is in state STARTED 2025-09-27 00:55:46.406564 | orchestrator | 2025-09-27 00:55:46 | INFO  | Task c9184fac-fcbd-41ba-a9c1-0709e4e59102 is in state SUCCESS 2025-09-27 00:55:46.409737 | orchestrator | 2025-09-27 00:55:46 | INFO  | Task 8c8495a2-5d72-42cb-9b5a-1c422acc7844 is in state STARTED 2025-09-27 00:55:46.410414 | orchestrator | 2025-09-27 00:55:46 | INFO  | Wait 1 second(s) until the next check 2025-09-27 00:55:49.468719 | orchestrator | 2025-09-27 00:55:49 | INFO  | Task e53b93e4-c64c-4396-8a50-60d422b94ec3 is in state STARTED 2025-09-27 00:55:49.469122 | orchestrator | 2025-09-27 00:55:49 | INFO  | Task d1ceef8b-3048-43ad-8f4d-705b5e676fb0 is in state STARTED 2025-09-27 00:55:49.470718 | orchestrator | 2025-09-27 00:55:49 | INFO  | Task cd555ed5-77dc-4122-95c3-86725144bacd is in state STARTED 2025-09-27 00:55:49.471769 | orchestrator | 2025-09-27 00:55:49 | INFO  | Task 8c8495a2-5d72-42cb-9b5a-1c422acc7844 is in state STARTED 2025-09-27 00:55:49.473220 | orchestrator | 2025-09-27 00:55:49 | INFO  | Task 04057c8c-9289-4f8f-8b90-07072f1a3ff6 is in state STARTED 2025-09-27 00:55:49.473242 | orchestrator | 2025-09-27 00:55:49 | INFO  | Wait 1 second(s) until the next check 2025-09-27 00:55:52.520434 | orchestrator | 2025-09-27 00:55:52 | INFO  | Task e53b93e4-c64c-4396-8a50-60d422b94ec3 is in state STARTED 2025-09-27 00:55:52.521349 | orchestrator | 2025-09-27 00:55:52 | INFO  | Task d1ceef8b-3048-43ad-8f4d-705b5e676fb0 is in state SUCCESS 2025-09-27 00:55:52.522388 | orchestrator | 2025-09-27 00:55:52.522413 | orchestrator | 2025-09-27 00:55:52.522425 | orchestrator | PLAY [Group hosts based on configuration] 
************************************** 2025-09-27 00:55:52.522437 | orchestrator | 2025-09-27 00:55:52.522448 | orchestrator | TASK [Group hosts based on Kolla action] *************************************** 2025-09-27 00:55:52.522459 | orchestrator | Saturday 27 September 2025 00:54:50 +0000 (0:00:00.299) 0:00:00.299 **** 2025-09-27 00:55:52.522470 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:55:52.522481 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:55:52.522492 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:55:52.522503 | orchestrator | 2025-09-27 00:55:52.522513 | orchestrator | TASK [Group hosts based on enabled services] *********************************** 2025-09-27 00:55:52.522524 | orchestrator | Saturday 27 September 2025 00:54:50 +0000 (0:00:00.317) 0:00:00.616 **** 2025-09-27 00:55:52.522554 | orchestrator | ok: [testbed-node-0] => (item=enable_designate_True) 2025-09-27 00:55:52.522565 | orchestrator | ok: [testbed-node-1] => (item=enable_designate_True) 2025-09-27 00:55:52.522576 | orchestrator | ok: [testbed-node-2] => (item=enable_designate_True) 2025-09-27 00:55:52.522586 | orchestrator | 2025-09-27 00:55:52.522597 | orchestrator | PLAY [Apply role designate] **************************************************** 2025-09-27 00:55:52.522607 | orchestrator | 2025-09-27 00:55:52.522618 | orchestrator | TASK [designate : include_tasks] *********************************************** 2025-09-27 00:55:52.522629 | orchestrator | Saturday 27 September 2025 00:54:51 +0000 (0:00:00.369) 0:00:00.986 **** 2025-09-27 00:55:52.522639 | orchestrator | included: /ansible/roles/designate/tasks/deploy.yml for testbed-node-0, testbed-node-1, testbed-node-2 2025-09-27 00:55:52.522650 | orchestrator | 2025-09-27 00:55:52.522661 | orchestrator | TASK [service-ks-register : designate | Creating services] ********************* 2025-09-27 00:55:52.522672 | orchestrator | Saturday 27 September 2025 00:54:51 +0000 (0:00:00.490) 0:00:01.476 **** 2025-09-27 00:55:52.522705 | orchestrator | FAILED - RETRYING: [testbed-node-0]: designate | Creating services (5 retries left). 2025-09-27 00:55:52.522716 | orchestrator | FAILED - RETRYING: [testbed-node-0]: designate | Creating services (4 retries left). 2025-09-27 00:55:52.522726 | orchestrator | FAILED - RETRYING: [testbed-node-0]: designate | Creating services (3 retries left). 2025-09-27 00:55:52.522737 | orchestrator | FAILED - RETRYING: [testbed-node-0]: designate | Creating services (2 retries left). 2025-09-27 00:55:52.522747 | orchestrator | FAILED - RETRYING: [testbed-node-0]: designate | Creating services (1 retries left). 
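The service-ks-register role retries each "Creating services" call several times before declaring the task failed, which is why the designate play here, and the barbican and neutron plays below, each work through five retries against the same missing kolla_toolbox container before giving up. Stripped of Ansible specifics, the behaviour is an ordinary retry-until loop; in the sketch below register_service is a hypothetical stand-in for the real module call, and the attempt count and delay are illustrative rather than taken from the role's defaults:

# retry_until.py -- sketch of the retries/until behaviour seen in these plays.
# register_service() is a hypothetical stand-in for the call that runs inside
# the kolla_toolbox container; attempts and delay are illustrative values.
import time

def with_retries(action, attempts: int = 5, delay: float = 10.0):
    for attempt in range(1, attempts + 1):
        try:
            return action()
        except RuntimeError as exc:  # e.g. "kolla_toolbox container is not running."
            remaining = attempts - attempt
            if remaining == 0:
                raise RuntimeError(f"failed after {attempts} attempts") from exc
            print(f"FAILED - RETRYING ({remaining} retries left).")
            time.sleep(delay)

# e.g. with_retries(lambda: register_service("designate", "dns"))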
2025-09-27 00:55:52.522760 | orchestrator | failed: [testbed-node-0] (item=designate (dns)) => {"ansible_loop_var": "item", "attempts": 5, "changed": false, "item": {"description": "Designate DNS Service", "endpoints": [{"interface": "internal", "url": "https://api-int.testbed.osism.xyz:9001"}, {"interface": "public", "url": "https://api.testbed.osism.xyz:9001"}], "name": "designate", "type": "dns"}, "msg": "kolla_toolbox container is not running."} 2025-09-27 00:55:52.522774 | orchestrator | 2025-09-27 00:55:52.522785 | orchestrator | PLAY RECAP ********************************************************************* 2025-09-27 00:55:52.522796 | orchestrator | testbed-node-0 : ok=3  changed=0 unreachable=0 failed=1  skipped=0 rescued=0 ignored=0 2025-09-27 00:55:52.522808 | orchestrator | testbed-node-1 : ok=3  changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 2025-09-27 00:55:52.522820 | orchestrator | testbed-node-2 : ok=3  changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 2025-09-27 00:55:52.522848 | orchestrator | 2025-09-27 00:55:52.522859 | orchestrator | 2025-09-27 00:55:52.522870 | orchestrator | TASKS RECAP ******************************************************************** 2025-09-27 00:55:52.522881 | orchestrator | Saturday 27 September 2025 00:55:44 +0000 (0:00:53.165) 0:00:54.642 **** 2025-09-27 00:55:52.522891 | orchestrator | =============================================================================== 2025-09-27 00:55:52.522902 | orchestrator | service-ks-register : designate | Creating services -------------------- 53.17s 2025-09-27 00:55:52.522913 | orchestrator | designate : include_tasks ----------------------------------------------- 0.49s 2025-09-27 00:55:52.522923 | orchestrator | Group hosts based on enabled services ----------------------------------- 0.37s 2025-09-27 00:55:52.522934 | orchestrator | Group hosts based on Kolla action --------------------------------------- 0.32s 2025-09-27 00:55:52.522944 | orchestrator | 2025-09-27 00:55:52.522955 | orchestrator | 2025-09-27 00:55:52.522966 | orchestrator | PLAY [Group hosts based on configuration] ************************************** 2025-09-27 00:55:52.522976 | orchestrator | 2025-09-27 00:55:52.522987 | orchestrator | TASK [Group hosts based on Kolla action] *************************************** 2025-09-27 00:55:52.522997 | orchestrator | Saturday 27 September 2025 00:54:50 +0000 (0:00:00.281) 0:00:00.281 **** 2025-09-27 00:55:52.523010 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:55:52.523022 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:55:52.523034 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:55:52.523046 | orchestrator | 2025-09-27 00:55:52.523086 | orchestrator | TASK [Group hosts based on enabled services] *********************************** 2025-09-27 00:55:52.523099 | orchestrator | Saturday 27 September 2025 00:54:51 +0000 (0:00:00.282) 0:00:00.564 **** 2025-09-27 00:55:52.523111 | orchestrator | ok: [testbed-node-0] => (item=enable_barbican_True) 2025-09-27 00:55:52.523124 | orchestrator | ok: [testbed-node-1] => (item=enable_barbican_True) 2025-09-27 00:55:52.523135 | orchestrator | ok: [testbed-node-2] => (item=enable_barbican_True) 2025-09-27 00:55:52.523148 | orchestrator | 2025-09-27 00:55:52.523160 | orchestrator | PLAY [Apply role barbican] ***************************************************** 2025-09-27 00:55:52.523172 | orchestrator | 2025-09-27 00:55:52.523184 | orchestrator | TASK [barbican : include_tasks] 
************************************************ 2025-09-27 00:55:52.523207 | orchestrator | Saturday 27 September 2025 00:54:51 +0000 (0:00:00.312) 0:00:00.876 **** 2025-09-27 00:55:52.523229 | orchestrator | included: /ansible/roles/barbican/tasks/deploy.yml for testbed-node-0, testbed-node-1, testbed-node-2 2025-09-27 00:55:52.523241 | orchestrator | 2025-09-27 00:55:52.523254 | orchestrator | TASK [service-ks-register : barbican | Creating services] ********************** 2025-09-27 00:55:52.523266 | orchestrator | Saturday 27 September 2025 00:54:51 +0000 (0:00:00.410) 0:00:01.287 **** 2025-09-27 00:55:52.523278 | orchestrator | FAILED - RETRYING: [testbed-node-0]: barbican | Creating services (5 retries left). 2025-09-27 00:55:52.523291 | orchestrator | FAILED - RETRYING: [testbed-node-0]: barbican | Creating services (4 retries left). 2025-09-27 00:55:52.523309 | orchestrator | FAILED - RETRYING: [testbed-node-0]: barbican | Creating services (3 retries left). 2025-09-27 00:55:52.523322 | orchestrator | FAILED - RETRYING: [testbed-node-0]: barbican | Creating services (2 retries left). 2025-09-27 00:55:52.523334 | orchestrator | FAILED - RETRYING: [testbed-node-0]: barbican | Creating services (1 retries left). 2025-09-27 00:55:52.523348 | orchestrator | failed: [testbed-node-0] (item=barbican (key-manager)) => {"ansible_loop_var": "item", "attempts": 5, "changed": false, "item": {"description": "Barbican Key Management Service", "endpoints": [{"interface": "internal", "url": "https://api-int.testbed.osism.xyz:9311"}, {"interface": "public", "url": "https://api.testbed.osism.xyz:9311"}], "name": "barbican", "type": "key-manager"}, "msg": "kolla_toolbox container is not running."} 2025-09-27 00:55:52.523362 | orchestrator | 2025-09-27 00:55:52.523373 | orchestrator | PLAY RECAP ********************************************************************* 2025-09-27 00:55:52.523384 | orchestrator | testbed-node-0 : ok=3  changed=0 unreachable=0 failed=1  skipped=0 rescued=0 ignored=0 2025-09-27 00:55:52.523395 | orchestrator | testbed-node-1 : ok=3  changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 2025-09-27 00:55:52.523406 | orchestrator | testbed-node-2 : ok=3  changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 2025-09-27 00:55:52.523416 | orchestrator | 2025-09-27 00:55:52.523427 | orchestrator | 2025-09-27 00:55:52.523437 | orchestrator | TASKS RECAP ******************************************************************** 2025-09-27 00:55:52.523448 | orchestrator | Saturday 27 September 2025 00:55:45 +0000 (0:00:53.234) 0:00:54.522 **** 2025-09-27 00:55:52.523459 | orchestrator | =============================================================================== 2025-09-27 00:55:52.523469 | orchestrator | service-ks-register : barbican | Creating services --------------------- 53.23s 2025-09-27 00:55:52.523587 | orchestrator | barbican : include_tasks ------------------------------------------------ 0.41s 2025-09-27 00:55:52.523599 | orchestrator | Group hosts based on enabled services ----------------------------------- 0.31s 2025-09-27 00:55:52.523610 | orchestrator | Group hosts based on Kolla action --------------------------------------- 0.28s 2025-09-27 00:55:52.523621 | orchestrator | 2025-09-27 00:55:52.523631 | orchestrator | 2025-09-27 00:55:52.523642 | orchestrator | PLAY [Group hosts based on configuration] ************************************** 2025-09-27 00:55:52.523652 | orchestrator | 2025-09-27 00:55:52.523663 | orchestrator | TASK [Group hosts based on 
Kolla action] *************************************** 2025-09-27 00:55:52.523674 | orchestrator | Saturday 27 September 2025 00:54:50 +0000 (0:00:00.273) 0:00:00.273 **** 2025-09-27 00:55:52.523684 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:55:52.523695 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:55:52.523706 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:55:52.523716 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:55:52.523727 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:55:52.523737 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:55:52.523748 | orchestrator | 2025-09-27 00:55:52.523759 | orchestrator | TASK [Group hosts based on enabled services] *********************************** 2025-09-27 00:55:52.523769 | orchestrator | Saturday 27 September 2025 00:54:51 +0000 (0:00:00.645) 0:00:00.918 **** 2025-09-27 00:55:52.523780 | orchestrator | ok: [testbed-node-0] => (item=enable_neutron_True) 2025-09-27 00:55:52.523800 | orchestrator | ok: [testbed-node-1] => (item=enable_neutron_True) 2025-09-27 00:55:52.523811 | orchestrator | ok: [testbed-node-2] => (item=enable_neutron_True) 2025-09-27 00:55:52.523821 | orchestrator | ok: [testbed-node-3] => (item=enable_neutron_True) 2025-09-27 00:55:52.523832 | orchestrator | ok: [testbed-node-4] => (item=enable_neutron_True) 2025-09-27 00:55:52.523843 | orchestrator | ok: [testbed-node-5] => (item=enable_neutron_True) 2025-09-27 00:55:52.523854 | orchestrator | 2025-09-27 00:55:52.523865 | orchestrator | PLAY [Apply role neutron] ****************************************************** 2025-09-27 00:55:52.523875 | orchestrator | 2025-09-27 00:55:52.523886 | orchestrator | TASK [neutron : include_tasks] ************************************************* 2025-09-27 00:55:52.523896 | orchestrator | Saturday 27 September 2025 00:54:52 +0000 (0:00:00.644) 0:00:01.563 **** 2025-09-27 00:55:52.523907 | orchestrator | included: /ansible/roles/neutron/tasks/deploy.yml for testbed-node-0, testbed-node-1, testbed-node-2, testbed-node-3, testbed-node-4, testbed-node-5 2025-09-27 00:55:52.523918 | orchestrator | 2025-09-27 00:55:52.523929 | orchestrator | TASK [neutron : Get container facts] ******************************************* 2025-09-27 00:55:52.523939 | orchestrator | Saturday 27 September 2025 00:54:53 +0000 (0:00:01.013) 0:00:02.577 **** 2025-09-27 00:55:52.523950 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:55:52.523961 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:55:52.523971 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:55:52.523982 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:55:52.523992 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:55:52.524003 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:55:52.524013 | orchestrator | 2025-09-27 00:55:52.524030 | orchestrator | TASK [neutron : Get container volume facts] ************************************ 2025-09-27 00:55:52.524042 | orchestrator | Saturday 27 September 2025 00:54:54 +0000 (0:00:01.252) 0:00:03.829 **** 2025-09-27 00:55:52.524052 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:55:52.524085 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:55:52.524096 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:55:52.524106 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:55:52.524117 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:55:52.524127 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:55:52.524138 | orchestrator | 2025-09-27 00:55:52.524149 | orchestrator | TASK [neutron : Check for ML2/OVN presence] 
************************************ 2025-09-27 00:55:52.524160 | orchestrator | Saturday 27 September 2025 00:54:55 +0000 (0:00:01.126) 0:00:04.956 **** 2025-09-27 00:55:52.524170 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:55:52.524181 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:55:52.524198 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:55:52.524209 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:55:52.524220 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:55:52.524231 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:55:52.524241 | orchestrator | 2025-09-27 00:55:52.524252 | orchestrator | TASK [neutron : Check for ML2/OVS presence] ************************************ 2025-09-27 00:55:52.524263 | orchestrator | Saturday 27 September 2025 00:54:56 +0000 (0:00:00.785) 0:00:05.742 **** 2025-09-27 00:55:52.524273 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:55:52.524284 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:55:52.524295 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:55:52.524305 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:55:52.524316 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:55:52.524327 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:55:52.524337 | orchestrator | 2025-09-27 00:55:52.524348 | orchestrator | TASK [service-ks-register : neutron | Creating services] *********************** 2025-09-27 00:55:52.524359 | orchestrator | Saturday 27 September 2025 00:54:56 +0000 (0:00:00.609) 0:00:06.351 **** 2025-09-27 00:55:52.524369 | orchestrator | FAILED - RETRYING: [testbed-node-0]: neutron | Creating services (5 retries left). 2025-09-27 00:55:52.524380 | orchestrator | FAILED - RETRYING: [testbed-node-0]: neutron | Creating services (4 retries left). 2025-09-27 00:55:52.524391 | orchestrator | FAILED - RETRYING: [testbed-node-0]: neutron | Creating services (3 retries left). 2025-09-27 00:55:52.524409 | orchestrator | FAILED - RETRYING: [testbed-node-0]: neutron | Creating services (2 retries left). 2025-09-27 00:55:52.524419 | orchestrator | FAILED - RETRYING: [testbed-node-0]: neutron | Creating services (1 retries left). 
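Before registering the service, the neutron role gathers container facts and checks whether an ML2/OVN or ML2/OVS deployment is already present; both checks are skipped here because nothing is running yet. The same idea can be expressed as a lookup over the names of running containers. The container names used below (ovn_controller, neutron_openvswitch_agent) are assumptions chosen for illustration and are not taken from this log:

# ml2_presence.py -- illustrative sketch of a "which ML2 backend is running" check.
# The container names are assumptions chosen for illustration only.
import subprocess

def running_containers(node: str) -> set[str]:
    # `docker ps --format {{.Names}}` lists the names of running containers.
    result = subprocess.run(
        ["ssh", node, "docker", "ps", "--format", "{{.Names}}"],
        capture_output=True, text=True, check=False,
    )
    return set(result.stdout.split())

def detect_ml2_backend(node: str) -> str:
    names = running_containers(node)
    if "ovn_controller" in names:
        return "ml2/ovn"
    if "neutron_openvswitch_agent" in names:
        return "ml2/ovs"
    return "no ML2 backend detected"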
2025-09-27 00:55:52.524431 | orchestrator | failed: [testbed-node-0] (item=neutron (network)) => {"ansible_loop_var": "item", "attempts": 5, "changed": false, "item": {"description": "Openstack Networking", "endpoints": [{"interface": "internal", "url": "https://api-int.testbed.osism.xyz:9696"}, {"interface": "public", "url": "https://api.testbed.osism.xyz:9696"}], "name": "neutron", "type": "network"}, "msg": "kolla_toolbox container is not running."} 2025-09-27 00:55:52.524442 | orchestrator | 2025-09-27 00:55:52.524453 | orchestrator | PLAY RECAP ********************************************************************* 2025-09-27 00:55:52.524464 | orchestrator | testbed-node-0 : ok=5  changed=0 unreachable=0 failed=1  skipped=2  rescued=0 ignored=0 2025-09-27 00:55:52.524475 | orchestrator | testbed-node-1 : ok=5  changed=0 unreachable=0 failed=0 skipped=2  rescued=0 ignored=0 2025-09-27 00:55:52.524486 | orchestrator | testbed-node-2 : ok=5  changed=0 unreachable=0 failed=0 skipped=2  rescued=0 ignored=0 2025-09-27 00:55:52.524497 | orchestrator | testbed-node-3 : ok=5  changed=0 unreachable=0 failed=0 skipped=2  rescued=0 ignored=0 2025-09-27 00:55:52.524507 | orchestrator | testbed-node-4 : ok=5  changed=0 unreachable=0 failed=0 skipped=2  rescued=0 ignored=0 2025-09-27 00:55:52.524518 | orchestrator | testbed-node-5 : ok=5  changed=0 unreachable=0 failed=0 skipped=2  rescued=0 ignored=0 2025-09-27 00:55:52.524529 | orchestrator | 2025-09-27 00:55:52.524539 | orchestrator | 2025-09-27 00:55:52.524550 | orchestrator | TASKS RECAP ******************************************************************** 2025-09-27 00:55:52.524561 | orchestrator | Saturday 27 September 2025 00:55:50 +0000 (0:00:53.228) 0:00:59.579 **** 2025-09-27 00:55:52.524572 | orchestrator | =============================================================================== 2025-09-27 00:55:52.524582 | orchestrator | service-ks-register : neutron | Creating services ---------------------- 53.23s 2025-09-27 00:55:52.524593 | orchestrator | neutron : Get container facts ------------------------------------------- 1.25s 2025-09-27 00:55:52.524603 | orchestrator | neutron : Get container volume facts ------------------------------------ 1.13s 2025-09-27 00:55:52.524614 | orchestrator | neutron : include_tasks ------------------------------------------------- 1.01s 2025-09-27 00:55:52.524625 | orchestrator | neutron : Check for ML2/OVN presence ------------------------------------ 0.79s 2025-09-27 00:55:52.524635 | orchestrator | Group hosts based on Kolla action --------------------------------------- 0.65s 2025-09-27 00:55:52.524646 | orchestrator | Group hosts based on enabled services ----------------------------------- 0.64s 2025-09-27 00:55:52.524656 | orchestrator | neutron : Check for ML2/OVS presence ------------------------------------ 0.61s 2025-09-27 00:55:52.524673 | orchestrator | 2025-09-27 00:55:52 | INFO  | Task cd555ed5-77dc-4122-95c3-86725144bacd is in state STARTED 2025-09-27 00:55:52.525348 | orchestrator | 2025-09-27 00:55:52 | INFO  | Task 8c8495a2-5d72-42cb-9b5a-1c422acc7844 is in state STARTED 2025-09-27 00:55:52.527361 | orchestrator | 2025-09-27 00:55:52 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 00:55:52.529782 | orchestrator | 2025-09-27 00:55:52 | INFO  | Task 04057c8c-9289-4f8f-8b90-07072f1a3ff6 is in state STARTED 2025-09-27 00:55:52.529804 | orchestrator | 2025-09-27 00:55:52 | INFO  | Wait 1 second(s) until the next check 2025-09-27 00:55:55.589193 | 
orchestrator | 2025-09-27 00:55:55 | INFO  | Task e53b93e4-c64c-4396-8a50-60d422b94ec3 is in state STARTED 2025-09-27 00:55:55.590240 | orchestrator | 2025-09-27 00:55:55 | INFO  | Task cd555ed5-77dc-4122-95c3-86725144bacd is in state STARTED 2025-09-27 00:55:55.591772 | orchestrator | 2025-09-27 00:55:55 | INFO  | Task 8c8495a2-5d72-42cb-9b5a-1c422acc7844 is in state STARTED 2025-09-27 00:55:55.592878 | orchestrator | 2025-09-27 00:55:55 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 00:55:55.594923 | orchestrator | 2025-09-27 00:55:55 | INFO  | Task 04057c8c-9289-4f8f-8b90-07072f1a3ff6 is in state STARTED 2025-09-27 00:55:55.594949 | orchestrator | 2025-09-27 00:55:55 | INFO  | Wait 1 second(s) until the next check 2025-09-27 00:55:58.641525 | orchestrator | 2025-09-27 00:55:58 | INFO  | Task e53b93e4-c64c-4396-8a50-60d422b94ec3 is in state STARTED 2025-09-27 00:55:58.643409 | orchestrator | 2025-09-27 00:55:58 | INFO  | Task cd555ed5-77dc-4122-95c3-86725144bacd is in state STARTED 2025-09-27 00:55:58.645591 | orchestrator | 2025-09-27 00:55:58 | INFO  | Task 8c8495a2-5d72-42cb-9b5a-1c422acc7844 is in state STARTED 2025-09-27 00:55:58.647517 | orchestrator | 2025-09-27 00:55:58 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 00:55:58.648810 | orchestrator | 2025-09-27 00:55:58 | INFO  | Task 04057c8c-9289-4f8f-8b90-07072f1a3ff6 is in state STARTED 2025-09-27 00:55:58.648910 | orchestrator | 2025-09-27 00:55:58 | INFO  | Wait 1 second(s) until the next check 2025-09-27 00:56:01.702402 | orchestrator | 2025-09-27 00:56:01 | INFO  | Task e53b93e4-c64c-4396-8a50-60d422b94ec3 is in state STARTED 2025-09-27 00:56:01.706792 | orchestrator | 2025-09-27 00:56:01 | INFO  | Task cd555ed5-77dc-4122-95c3-86725144bacd is in state STARTED 2025-09-27 00:56:01.711234 | orchestrator | 2025-09-27 00:56:01 | INFO  | Task 8c8495a2-5d72-42cb-9b5a-1c422acc7844 is in state STARTED 2025-09-27 00:56:01.713272 | orchestrator | 2025-09-27 00:56:01 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 00:56:01.716790 | orchestrator | 2025-09-27 00:56:01 | INFO  | Task 04057c8c-9289-4f8f-8b90-07072f1a3ff6 is in state STARTED 2025-09-27 00:56:01.716820 | orchestrator | 2025-09-27 00:56:01 | INFO  | Wait 1 second(s) until the next check 2025-09-27 00:56:04.765146 | orchestrator | 2025-09-27 00:56:04 | INFO  | Task e53b93e4-c64c-4396-8a50-60d422b94ec3 is in state STARTED 2025-09-27 00:56:04.767316 | orchestrator | 2025-09-27 00:56:04 | INFO  | Task cd555ed5-77dc-4122-95c3-86725144bacd is in state STARTED 2025-09-27 00:56:04.769643 | orchestrator | 2025-09-27 00:56:04 | INFO  | Task 8c8495a2-5d72-42cb-9b5a-1c422acc7844 is in state STARTED 2025-09-27 00:56:04.772857 | orchestrator | 2025-09-27 00:56:04 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 00:56:04.774986 | orchestrator | 2025-09-27 00:56:04 | INFO  | Task 04057c8c-9289-4f8f-8b90-07072f1a3ff6 is in state STARTED 2025-09-27 00:56:04.775222 | orchestrator | 2025-09-27 00:56:04 | INFO  | Wait 1 second(s) until the next check 2025-09-27 00:56:07.829905 | orchestrator | 2025-09-27 00:56:07 | INFO  | Task e53b93e4-c64c-4396-8a50-60d422b94ec3 is in state STARTED 2025-09-27 00:56:07.833226 | orchestrator | 2025-09-27 00:56:07 | INFO  | Task cd555ed5-77dc-4122-95c3-86725144bacd is in state STARTED 2025-09-27 00:56:07.835088 | orchestrator | 2025-09-27 00:56:07 | INFO  | Task 8c8495a2-5d72-42cb-9b5a-1c422acc7844 is in state STARTED 
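Each failed loop item in the designate, barbican and neutron plays above carries the full service definition that service-ks-register would have applied: a service name and type plus one internal and one public endpoint URL. Modelling that structure directly makes the intent of the loop easier to read; the sketch below only mirrors the data shown in this log and prints the registration plan instead of calling any OpenStack API:

# service_defs.py -- data-only sketch mirroring the failed loop items in this log.
from dataclasses import dataclass

@dataclass
class Endpoint:
    interface: str  # "internal" or "public"
    url: str

@dataclass
class ServiceDef:
    name: str
    type: str
    description: str
    endpoints: list[Endpoint]

# Values copied from the neutron item shown above.
NEUTRON = ServiceDef(
    name="neutron",
    type="network",
    description="Openstack Networking",
    endpoints=[
        Endpoint("internal", "https://api-int.testbed.osism.xyz:9696"),
        Endpoint("public", "https://api.testbed.osism.xyz:9696"),
    ],
)

def registration_plan(service: ServiceDef) -> None:
    # A real run would create the service and its endpoints in Keystone via the
    # kolla_toolbox container; this sketch only describes what would be done.
    print(f"service {service.name} ({service.type}): {service.description}")
    for endpoint in service.endpoints:
        print(f"  endpoint {endpoint.interface}: {endpoint.url}")

if __name__ == "__main__":
    registration_plan(NEUTRON)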
2025-09-27 00:56:07.836918 | orchestrator | 2025-09-27 00:56:07 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 00:56:07.839121 | orchestrator | 2025-09-27 00:56:07 | INFO  | Task 04057c8c-9289-4f8f-8b90-07072f1a3ff6 is in state STARTED 2025-09-27 00:56:07.839461 | orchestrator | 2025-09-27 00:56:07 | INFO  | Wait 1 second(s) until the next check 2025-09-27 00:56:10.887238 | orchestrator | 2025-09-27 00:56:10 | INFO  | Task e53b93e4-c64c-4396-8a50-60d422b94ec3 is in state STARTED 2025-09-27 00:56:10.892615 | orchestrator | 2025-09-27 00:56:10 | INFO  | Task cd555ed5-77dc-4122-95c3-86725144bacd is in state SUCCESS 2025-09-27 00:56:10.894596 | orchestrator | 2025-09-27 00:56:10.894688 | orchestrator | 2025-09-27 00:56:10.894703 | orchestrator | PLAY [Create ceph pools] ******************************************************* 2025-09-27 00:56:10.894716 | orchestrator | 2025-09-27 00:56:10.894728 | orchestrator | TASK [ceph-facts : Include facts.yml] ****************************************** 2025-09-27 00:56:10.894739 | orchestrator | Saturday 27 September 2025 00:54:00 +0000 (0:00:00.543) 0:00:00.543 **** 2025-09-27 00:56:10.894751 | orchestrator | included: /ansible/roles/ceph-facts/tasks/facts.yml for testbed-node-3, testbed-node-4, testbed-node-5 2025-09-27 00:56:10.894763 | orchestrator | 2025-09-27 00:56:10.894791 | orchestrator | TASK [ceph-facts : Check if it is atomic host] ********************************* 2025-09-27 00:56:10.894802 | orchestrator | Saturday 27 September 2025 00:54:00 +0000 (0:00:00.489) 0:00:01.032 **** 2025-09-27 00:56:10.894813 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:56:10.894825 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:56:10.894836 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:56:10.894847 | orchestrator | 2025-09-27 00:56:10.894858 | orchestrator | TASK [ceph-facts : Set_fact is_atomic] ***************************************** 2025-09-27 00:56:10.894869 | orchestrator | Saturday 27 September 2025 00:54:01 +0000 (0:00:00.689) 0:00:01.721 **** 2025-09-27 00:56:10.894880 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:56:10.894890 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:56:10.894901 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:56:10.894912 | orchestrator | 2025-09-27 00:56:10.894923 | orchestrator | TASK [ceph-facts : Check if podman binary is present] ************************** 2025-09-27 00:56:10.894934 | orchestrator | Saturday 27 September 2025 00:54:01 +0000 (0:00:00.276) 0:00:01.997 **** 2025-09-27 00:56:10.894944 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:56:10.894955 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:56:10.894966 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:56:10.894977 | orchestrator | 2025-09-27 00:56:10.894988 | orchestrator | TASK [ceph-facts : Set_fact container_binary] ********************************** 2025-09-27 00:56:10.894999 | orchestrator | Saturday 27 September 2025 00:54:02 +0000 (0:00:00.726) 0:00:02.724 **** 2025-09-27 00:56:10.895010 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:56:10.895021 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:56:10.895032 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:56:10.895043 | orchestrator | 2025-09-27 00:56:10.895099 | orchestrator | TASK [ceph-facts : Set_fact ceph_cmd] ****************************************** 2025-09-27 00:56:10.895112 | orchestrator | Saturday 27 September 2025 00:54:02 +0000 (0:00:00.265) 0:00:02.989 **** 2025-09-27 00:56:10.895123 | orchestrator 
| ok: [testbed-node-3] 2025-09-27 00:56:10.895136 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:56:10.895148 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:56:10.895160 | orchestrator | 2025-09-27 00:56:10.895172 | orchestrator | TASK [ceph-facts : Set_fact discovered_interpreter_python] ********************* 2025-09-27 00:56:10.895185 | orchestrator | Saturday 27 September 2025 00:54:02 +0000 (0:00:00.263) 0:00:03.252 **** 2025-09-27 00:56:10.895197 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:56:10.895210 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:56:10.895222 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:56:10.895235 | orchestrator | 2025-09-27 00:56:10.895248 | orchestrator | TASK [ceph-facts : Set_fact discovered_interpreter_python if not previously set] *** 2025-09-27 00:56:10.895262 | orchestrator | Saturday 27 September 2025 00:54:03 +0000 (0:00:00.258) 0:00:03.510 **** 2025-09-27 00:56:10.895274 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:56:10.895287 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:56:10.895300 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:56:10.895334 | orchestrator | 2025-09-27 00:56:10.895347 | orchestrator | TASK [ceph-facts : Set_fact ceph_release ceph_stable_release] ****************** 2025-09-27 00:56:10.895360 | orchestrator | Saturday 27 September 2025 00:54:03 +0000 (0:00:00.359) 0:00:03.870 **** 2025-09-27 00:56:10.895373 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:56:10.895384 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:56:10.895396 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:56:10.895409 | orchestrator | 2025-09-27 00:56:10.895421 | orchestrator | TASK [ceph-facts : Set_fact monitor_name ansible_facts['hostname']] ************ 2025-09-27 00:56:10.895434 | orchestrator | Saturday 27 September 2025 00:54:03 +0000 (0:00:00.238) 0:00:04.109 **** 2025-09-27 00:56:10.895447 | orchestrator | ok: [testbed-node-3 -> testbed-node-0(192.168.16.10)] => (item=testbed-node-0) 2025-09-27 00:56:10.895459 | orchestrator | ok: [testbed-node-3 -> testbed-node-1(192.168.16.11)] => (item=testbed-node-1) 2025-09-27 00:56:10.895471 | orchestrator | ok: [testbed-node-3 -> testbed-node-2(192.168.16.12)] => (item=testbed-node-2) 2025-09-27 00:56:10.895483 | orchestrator | 2025-09-27 00:56:10.895494 | orchestrator | TASK [ceph-facts : Set_fact container_exec_cmd] ******************************** 2025-09-27 00:56:10.895505 | orchestrator | Saturday 27 September 2025 00:54:04 +0000 (0:00:00.565) 0:00:04.675 **** 2025-09-27 00:56:10.895515 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:56:10.895526 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:56:10.895537 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:56:10.895547 | orchestrator | 2025-09-27 00:56:10.895558 | orchestrator | TASK [ceph-facts : Find a running mon container] ******************************* 2025-09-27 00:56:10.895569 | orchestrator | Saturday 27 September 2025 00:54:04 +0000 (0:00:00.409) 0:00:05.084 **** 2025-09-27 00:56:10.895579 | orchestrator | ok: [testbed-node-3 -> testbed-node-0(192.168.16.10)] => (item=testbed-node-0) 2025-09-27 00:56:10.895590 | orchestrator | ok: [testbed-node-3 -> testbed-node-1(192.168.16.11)] => (item=testbed-node-1) 2025-09-27 00:56:10.895601 | orchestrator | ok: [testbed-node-3 -> testbed-node-2(192.168.16.12)] => (item=testbed-node-2) 2025-09-27 00:56:10.895611 | orchestrator | 2025-09-27 00:56:10.895622 | orchestrator | TASK [ceph-facts : Check for a ceph mon socket] 
******************************** 2025-09-27 00:56:10.895633 | orchestrator | Saturday 27 September 2025 00:54:06 +0000 (0:00:02.192) 0:00:07.277 **** 2025-09-27 00:56:10.895643 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-0)  2025-09-27 00:56:10.895654 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-1)  2025-09-27 00:56:10.895665 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-2)  2025-09-27 00:56:10.895675 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:56:10.895686 | orchestrator | 2025-09-27 00:56:10.895697 | orchestrator | TASK [ceph-facts : Check if the ceph mon socket is in-use] ********************* 2025-09-27 00:56:10.895762 | orchestrator | Saturday 27 September 2025 00:54:07 +0000 (0:00:00.398) 0:00:07.675 **** 2025-09-27 00:56:10.895777 | orchestrator | skipping: [testbed-node-3] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'not containerized_deployment | bool', 'item': 'testbed-node-0', 'ansible_loop_var': 'item'})  2025-09-27 00:56:10.895799 | orchestrator | skipping: [testbed-node-3] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'not containerized_deployment | bool', 'item': 'testbed-node-1', 'ansible_loop_var': 'item'})  2025-09-27 00:56:10.895811 | orchestrator | skipping: [testbed-node-3] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'not containerized_deployment | bool', 'item': 'testbed-node-2', 'ansible_loop_var': 'item'})  2025-09-27 00:56:10.895822 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:56:10.895833 | orchestrator | 2025-09-27 00:56:10.895844 | orchestrator | TASK [ceph-facts : Set_fact running_mon - non_container] *********************** 2025-09-27 00:56:10.895855 | orchestrator | Saturday 27 September 2025 00:54:07 +0000 (0:00:00.813) 0:00:08.489 **** 2025-09-27 00:56:10.895868 | orchestrator | skipping: [testbed-node-3] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'not containerized_deployment | bool', 'item': {'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'not containerized_deployment | bool', 'item': 'testbed-node-0', 'ansible_loop_var': 'item'}, 'ansible_loop_var': 'item'})  2025-09-27 00:56:10.895890 | orchestrator | skipping: [testbed-node-3] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'not containerized_deployment | bool', 'item': {'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'not containerized_deployment | bool', 'item': 'testbed-node-1', 'ansible_loop_var': 'item'}, 'ansible_loop_var': 'item'})  2025-09-27 00:56:10.895902 | orchestrator | skipping: [testbed-node-3] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'not containerized_deployment | bool', 'item': {'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'not containerized_deployment | bool', 'item': 'testbed-node-2', 'ansible_loop_var': 'item'}, 'ansible_loop_var': 'item'})  2025-09-27 00:56:10.895913 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:56:10.895924 | orchestrator | 2025-09-27 00:56:10.895935 | orchestrator | TASK [ceph-facts : Set_fact running_mon - 
container] *************************** 2025-09-27 00:56:10.895946 | orchestrator | Saturday 27 September 2025 00:54:08 +0000 (0:00:00.192) 0:00:08.682 **** 2025-09-27 00:56:10.895958 | orchestrator | ok: [testbed-node-3] => (item={'changed': False, 'stdout': 'e3791e9e687e', 'stderr': '', 'rc': 0, 'cmd': ['docker', 'ps', '-q', '--filter', 'name=ceph-mon-testbed-node-0'], 'start': '2025-09-27 00:54:05.229148', 'end': '2025-09-27 00:54:05.287108', 'delta': '0:00:00.057960', 'msg': '', 'invocation': {'module_args': {'_raw_params': 'docker ps -q --filter name=ceph-mon-testbed-node-0', '_uses_shell': False, 'expand_argument_vars': True, 'stdin_add_newline': True, 'strip_empty_ends': True, 'argv': None, 'chdir': None, 'executable': None, 'creates': None, 'removes': None, 'stdin': None}}, 'stdout_lines': ['e3791e9e687e'], 'stderr_lines': [], 'failed': False, 'failed_when_result': False, 'item': 'testbed-node-0', 'ansible_loop_var': 'item'}) 2025-09-27 00:56:10.895973 | orchestrator | ok: [testbed-node-3] => (item={'changed': False, 'stdout': 'f45de2e76022', 'stderr': '', 'rc': 0, 'cmd': ['docker', 'ps', '-q', '--filter', 'name=ceph-mon-testbed-node-1'], 'start': '2025-09-27 00:54:06.050807', 'end': '2025-09-27 00:54:06.104003', 'delta': '0:00:00.053196', 'msg': '', 'invocation': {'module_args': {'_raw_params': 'docker ps -q --filter name=ceph-mon-testbed-node-1', '_uses_shell': False, 'expand_argument_vars': True, 'stdin_add_newline': True, 'strip_empty_ends': True, 'argv': None, 'chdir': None, 'executable': None, 'creates': None, 'removes': None, 'stdin': None}}, 'stdout_lines': ['f45de2e76022'], 'stderr_lines': [], 'failed': False, 'failed_when_result': False, 'item': 'testbed-node-1', 'ansible_loop_var': 'item'}) 2025-09-27 00:56:10.896042 | orchestrator | ok: [testbed-node-3] => (item={'changed': False, 'stdout': '5a1dab212744', 'stderr': '', 'rc': 0, 'cmd': ['docker', 'ps', '-q', '--filter', 'name=ceph-mon-testbed-node-2'], 'start': '2025-09-27 00:54:06.586759', 'end': '2025-09-27 00:54:06.638134', 'delta': '0:00:00.051375', 'msg': '', 'invocation': {'module_args': {'_raw_params': 'docker ps -q --filter name=ceph-mon-testbed-node-2', '_uses_shell': False, 'expand_argument_vars': True, 'stdin_add_newline': True, 'strip_empty_ends': True, 'argv': None, 'chdir': None, 'executable': None, 'creates': None, 'removes': None, 'stdin': None}}, 'stdout_lines': ['5a1dab212744'], 'stderr_lines': [], 'failed': False, 'failed_when_result': False, 'item': 'testbed-node-2', 'ansible_loop_var': 'item'}) 2025-09-27 00:56:10.896081 | orchestrator | 2025-09-27 00:56:10.896093 | orchestrator | TASK [ceph-facts : Set_fact _container_exec_cmd] ******************************* 2025-09-27 00:56:10.896112 | orchestrator | Saturday 27 September 2025 00:54:08 +0000 (0:00:00.363) 0:00:09.045 **** 2025-09-27 00:56:10.896123 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:56:10.896134 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:56:10.896145 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:56:10.896156 | orchestrator | 2025-09-27 00:56:10.896167 | orchestrator | TASK [ceph-facts : Get current fsid if cluster is already running] ************* 2025-09-27 00:56:10.896177 | orchestrator | Saturday 27 September 2025 00:54:09 +0000 (0:00:00.483) 0:00:09.528 **** 2025-09-27 00:56:10.896188 | orchestrator | ok: [testbed-node-3 -> testbed-node-2(192.168.16.12)] 2025-09-27 00:56:10.896199 | orchestrator | 2025-09-27 00:56:10.896210 | orchestrator | TASK [ceph-facts : Set_fact current_fsid rc 1] 
********************************* 2025-09-27 00:56:10.896221 | orchestrator | Saturday 27 September 2025 00:54:10 +0000 (0:00:01.760) 0:00:11.288 **** 2025-09-27 00:56:10.896232 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:56:10.896242 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:56:10.896253 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:56:10.896264 | orchestrator | 2025-09-27 00:56:10.896274 | orchestrator | TASK [ceph-facts : Get current fsid] ******************************************* 2025-09-27 00:56:10.896285 | orchestrator | Saturday 27 September 2025 00:54:11 +0000 (0:00:00.301) 0:00:11.590 **** 2025-09-27 00:56:10.896296 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:56:10.896306 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:56:10.896317 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:56:10.896327 | orchestrator | 2025-09-27 00:56:10.896338 | orchestrator | TASK [ceph-facts : Set_fact fsid] ********************************************** 2025-09-27 00:56:10.896349 | orchestrator | Saturday 27 September 2025 00:54:11 +0000 (0:00:00.427) 0:00:12.018 **** 2025-09-27 00:56:10.896360 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:56:10.896370 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:56:10.896381 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:56:10.896391 | orchestrator | 2025-09-27 00:56:10.896402 | orchestrator | TASK [ceph-facts : Set_fact fsid from current_fsid] **************************** 2025-09-27 00:56:10.896413 | orchestrator | Saturday 27 September 2025 00:54:12 +0000 (0:00:00.477) 0:00:12.495 **** 2025-09-27 00:56:10.896423 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:56:10.896434 | orchestrator | 2025-09-27 00:56:10.896445 | orchestrator | TASK [ceph-facts : Generate cluster fsid] ************************************** 2025-09-27 00:56:10.896455 | orchestrator | Saturday 27 September 2025 00:54:12 +0000 (0:00:00.137) 0:00:12.633 **** 2025-09-27 00:56:10.896466 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:56:10.896477 | orchestrator | 2025-09-27 00:56:10.896487 | orchestrator | TASK [ceph-facts : Set_fact fsid] ********************************************** 2025-09-27 00:56:10.896498 | orchestrator | Saturday 27 September 2025 00:54:12 +0000 (0:00:00.226) 0:00:12.860 **** 2025-09-27 00:56:10.896508 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:56:10.896519 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:56:10.896530 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:56:10.896541 | orchestrator | 2025-09-27 00:56:10.896551 | orchestrator | TASK [ceph-facts : Resolve device link(s)] ************************************* 2025-09-27 00:56:10.896562 | orchestrator | Saturday 27 September 2025 00:54:12 +0000 (0:00:00.282) 0:00:13.142 **** 2025-09-27 00:56:10.896573 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:56:10.896583 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:56:10.896594 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:56:10.896605 | orchestrator | 2025-09-27 00:56:10.896615 | orchestrator | TASK [ceph-facts : Set_fact build devices from resolved symlinks] ************** 2025-09-27 00:56:10.896626 | orchestrator | Saturday 27 September 2025 00:54:12 +0000 (0:00:00.325) 0:00:13.467 **** 2025-09-27 00:56:10.896637 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:56:10.896648 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:56:10.896658 | orchestrator | skipping: 
[testbed-node-5] 2025-09-27 00:56:10.896669 | orchestrator | 2025-09-27 00:56:10.896680 | orchestrator | TASK [ceph-facts : Resolve dedicated_device link(s)] *************************** 2025-09-27 00:56:10.896698 | orchestrator | Saturday 27 September 2025 00:54:13 +0000 (0:00:00.507) 0:00:13.975 **** 2025-09-27 00:56:10.896708 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:56:10.896719 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:56:10.896729 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:56:10.896740 | orchestrator | 2025-09-27 00:56:10.896751 | orchestrator | TASK [ceph-facts : Set_fact build dedicated_devices from resolved symlinks] **** 2025-09-27 00:56:10.896762 | orchestrator | Saturday 27 September 2025 00:54:13 +0000 (0:00:00.330) 0:00:14.306 **** 2025-09-27 00:56:10.896772 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:56:10.896783 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:56:10.896794 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:56:10.896804 | orchestrator | 2025-09-27 00:56:10.896815 | orchestrator | TASK [ceph-facts : Resolve bluestore_wal_device link(s)] *********************** 2025-09-27 00:56:10.896826 | orchestrator | Saturday 27 September 2025 00:54:14 +0000 (0:00:00.338) 0:00:14.644 **** 2025-09-27 00:56:10.896837 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:56:10.896847 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:56:10.896858 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:56:10.896869 | orchestrator | 2025-09-27 00:56:10.896880 | orchestrator | TASK [ceph-facts : Set_fact build bluestore_wal_devices from resolved symlinks] *** 2025-09-27 00:56:10.896925 | orchestrator | Saturday 27 September 2025 00:54:14 +0000 (0:00:00.301) 0:00:14.946 **** 2025-09-27 00:56:10.896938 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:56:10.896949 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:56:10.896960 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:56:10.896971 | orchestrator | 2025-09-27 00:56:10.896981 | orchestrator | TASK [ceph-facts : Collect existed devices] ************************************ 2025-09-27 00:56:10.896992 | orchestrator | Saturday 27 September 2025 00:54:14 +0000 (0:00:00.510) 0:00:15.457 **** 2025-09-27 00:56:10.897010 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'dm-0', 'value': {'holders': [], 'host': '', 'links': {'ids': ['dm-name-ceph--e62f59a6--4044--5e93--b85c--9f8cca280e9f-osd--block--e62f59a6--4044--5e93--b85c--9f8cca280e9f', 'dm-uuid-LVM-2j0R2lsOV7uYm5mBjwcNbbSh1kQKC6aWFs37eraHciH3dG7KX2uUyrR9le9M1Sxc'], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': '', 'sectors': 41934848, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': None, 'virtual': 1}})  2025-09-27 00:56:10.897022 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'dm-0', 'value': {'holders': [], 'host': '', 'links': {'ids': ['dm-name-ceph--025d8a54--72cd--5dfc--843f--2890244ba468-osd--block--025d8a54--72cd--5dfc--843f--2890244ba468', 'dm-uuid-LVM-XHCwvuU1sjTmaK85YDSmR6G7sbpVpAP3JMUNdJjHiYoiRZ0xWzEN1AgLJsu20b10'], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': '', 'sectors': 41934848, 'sectorsize': '512', 'size': '20.00 GB', 
'support_discard': '4096', 'vendor': None, 'virtual': 1}})  2025-09-27 00:56:10.897034 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'dm-1', 'value': {'holders': [], 'host': '', 'links': {'ids': ['dm-name-ceph--634a63d2--bd22--5328--9676--28392545ed43-osd--block--634a63d2--bd22--5328--9676--28392545ed43', 'dm-uuid-LVM-UlAlFHjSEGexCmx3gfRFT7UDwjIGr9mS6RTZCSiLBafcbpblurexRxLJlqNlUnkx'], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': '', 'sectors': 41934848, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': None, 'virtual': 1}})  2025-09-27 00:56:10.897045 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'dm-0', 'value': {'holders': [], 'host': '', 'links': {'ids': ['dm-name-ceph--03e94b17--8e91--5aba--9ae0--0b9f0a63cf06-osd--block--03e94b17--8e91--5aba--9ae0--0b9f0a63cf06', 'dm-uuid-LVM-K7Add9racGU2L9Njoe4PiYwcIpDjr05MSre6J2Y3OxofcXM429pZ0szxlstbspnD'], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': '', 'sectors': 41934848, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': None, 'virtual': 1}})  2025-09-27 00:56:10.897084 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'loop0', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2025-09-27 00:56:10.897096 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'dm-1', 'value': {'holders': [], 'host': '', 'links': {'ids': ['dm-name-ceph--9ca7935d--e986--5962--b530--505e6c7ac609-osd--block--9ca7935d--e986--5962--b530--505e6c7ac609', 'dm-uuid-LVM-bDhFdDwT54ouiDFbfCRj6kE8iH1XDG316Ib1iSwqIA7E8LyFNu82J4CqDZUvs2si'], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': '', 'sectors': 41934848, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': None, 'virtual': 1}})  2025-09-27 00:56:10.897142 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'loop1', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2025-09-27 00:56:10.897160 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'dm-1', 'value': {'holders': [], 'host': '', 'links': {'ids': ['dm-name-ceph--26537eb5--d37a--51fe--a7ad--0ae3582304de-osd--block--26537eb5--d37a--51fe--a7ad--0ae3582304de', 'dm-uuid-LVM-9u1Et2K86e5ar3MRSDs86Q4n2GlHJ9LdwOYDbas8h12ne7B2BpRv09Athluk4exq'], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': '', 'sectors': 41934848, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': None, 
'virtual': 1}})  2025-09-27 00:56:10.897172 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'loop2', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2025-09-27 00:56:10.897184 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'loop0', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2025-09-27 00:56:10.897195 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'loop0', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2025-09-27 00:56:10.897207 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'loop1', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2025-09-27 00:56:10.897225 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'loop3', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2025-09-27 00:56:10.897236 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'loop2', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2025-09-27 00:56:10.897247 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'loop4', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2025-09-27 00:56:10.897290 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'loop3', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 
'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2025-09-27 00:56:10.897303 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'loop4', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2025-09-27 00:56:10.897315 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'loop5', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2025-09-27 00:56:10.897326 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'loop1', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2025-09-27 00:56:10.897337 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'loop5', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2025-09-27 00:56:10.897348 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'loop6', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2025-09-27 00:56:10.897366 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'loop2', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2025-09-27 00:56:10.897409 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'loop6', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2025-09-27 00:56:10.897421 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'loop3', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 
'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2025-09-27 00:56:10.897432 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'loop7', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2025-09-27 00:56:10.897477 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'loop7', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2025-09-27 00:56:10.897498 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'sda', 'value': {'holders': [], 'host': 'SCSI storage controller: Red Hat, Inc. Virtio SCSI', 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_ccfdba47-3be1-47ca-9d9b-c14dc21688e2', 'scsi-SQEMU_QEMU_HARDDISK_ccfdba47-3be1-47ca-9d9b-c14dc21688e2'], 'labels': [], 'masters': [], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {'sda1': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_ccfdba47-3be1-47ca-9d9b-c14dc21688e2-part1', 'scsi-SQEMU_QEMU_HARDDISK_ccfdba47-3be1-47ca-9d9b-c14dc21688e2-part1'], 'labels': ['cloudimg-rootfs'], 'masters': [], 'uuids': ['b852d8d2-8460-44aa-8998-23e4f04d73cf']}, 'sectors': 165672927, 'sectorsize': 512, 'size': '79.00 GB', 'start': '2099200', 'uuid': 'b852d8d2-8460-44aa-8998-23e4f04d73cf'}, 'sda14': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_ccfdba47-3be1-47ca-9d9b-c14dc21688e2-part14', 'scsi-SQEMU_QEMU_HARDDISK_ccfdba47-3be1-47ca-9d9b-c14dc21688e2-part14'], 'labels': [], 'masters': [], 'uuids': []}, 'sectors': 8192, 'sectorsize': 512, 'size': '4.00 MB', 'start': '2048', 'uuid': None}, 'sda15': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_ccfdba47-3be1-47ca-9d9b-c14dc21688e2-part15', 'scsi-SQEMU_QEMU_HARDDISK_ccfdba47-3be1-47ca-9d9b-c14dc21688e2-part15'], 'labels': ['UEFI'], 'masters': [], 'uuids': ['5C78-612A']}, 'sectors': 217088, 'sectorsize': 512, 'size': '106.00 MB', 'start': '10240', 'uuid': '5C78-612A'}, 'sda16': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_ccfdba47-3be1-47ca-9d9b-c14dc21688e2-part16', 'scsi-SQEMU_QEMU_HARDDISK_ccfdba47-3be1-47ca-9d9b-c14dc21688e2-part16'], 'labels': ['BOOT'], 'masters': [], 'uuids': ['09d53dc1-1e03-4286-bbb8-2b1796cf92ec']}, 'sectors': 1869825, 'sectorsize': 512, 'size': '913.00 MB', 'start': '227328', 'uuid': '09d53dc1-1e03-4286-bbb8-2b1796cf92ec'}}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 167772160, 'sectorsize': '512', 'size': '80.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}})  2025-09-27 00:56:10.897528 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'sdb', 'value': {'holders': ['ceph--e62f59a6--4044--5e93--b85c--9f8cca280e9f-osd--block--e62f59a6--4044--5e93--b85c--9f8cca280e9f'], 'host': 'SCSI storage controller: Red Hat, Inc. 
Virtio SCSI', 'links': {'ids': ['lvm-pv-uuid-Me9aue-lek3-JPg0-VYec-326H-ZuKM-XDWaPz', 'scsi-0QEMU_QEMU_HARDDISK_e258aa1c-ff59-4b5b-956f-d2cfc00f460b', 'scsi-SQEMU_QEMU_HARDDISK_e258aa1c-ff59-4b5b-956f-d2cfc00f460b'], 'labels': [], 'masters': ['dm-0'], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 41943040, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}})  2025-09-27 00:56:10.897554 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'sda', 'value': {'holders': [], 'host': 'SCSI storage controller: Red Hat, Inc. Virtio SCSI', 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_9830fc06-72ca-4b97-ae11-006364930d3a', 'scsi-SQEMU_QEMU_HARDDISK_9830fc06-72ca-4b97-ae11-006364930d3a'], 'labels': [], 'masters': [], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {'sda1': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_9830fc06-72ca-4b97-ae11-006364930d3a-part1', 'scsi-SQEMU_QEMU_HARDDISK_9830fc06-72ca-4b97-ae11-006364930d3a-part1'], 'labels': ['cloudimg-rootfs'], 'masters': [], 'uuids': ['b852d8d2-8460-44aa-8998-23e4f04d73cf']}, 'sectors': 165672927, 'sectorsize': 512, 'size': '79.00 GB', 'start': '2099200', 'uuid': 'b852d8d2-8460-44aa-8998-23e4f04d73cf'}, 'sda14': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_9830fc06-72ca-4b97-ae11-006364930d3a-part14', 'scsi-SQEMU_QEMU_HARDDISK_9830fc06-72ca-4b97-ae11-006364930d3a-part14'], 'labels': [], 'masters': [], 'uuids': []}, 'sectors': 8192, 'sectorsize': 512, 'size': '4.00 MB', 'start': '2048', 'uuid': None}, 'sda15': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_9830fc06-72ca-4b97-ae11-006364930d3a-part15', 'scsi-SQEMU_QEMU_HARDDISK_9830fc06-72ca-4b97-ae11-006364930d3a-part15'], 'labels': ['UEFI'], 'masters': [], 'uuids': ['5C78-612A']}, 'sectors': 217088, 'sectorsize': 512, 'size': '106.00 MB', 'start': '10240', 'uuid': '5C78-612A'}, 'sda16': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_9830fc06-72ca-4b97-ae11-006364930d3a-part16', 'scsi-SQEMU_QEMU_HARDDISK_9830fc06-72ca-4b97-ae11-006364930d3a-part16'], 'labels': ['BOOT'], 'masters': [], 'uuids': ['09d53dc1-1e03-4286-bbb8-2b1796cf92ec']}, 'sectors': 1869825, 'sectorsize': 512, 'size': '913.00 MB', 'start': '227328', 'uuid': '09d53dc1-1e03-4286-bbb8-2b1796cf92ec'}}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 167772160, 'sectorsize': '512', 'size': '80.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}})  2025-09-27 00:56:10.897568 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'sdc', 'value': {'holders': ['ceph--634a63d2--bd22--5328--9676--28392545ed43-osd--block--634a63d2--bd22--5328--9676--28392545ed43'], 'host': 'SCSI storage controller: Red Hat, Inc. 
Virtio SCSI', 'links': {'ids': ['lvm-pv-uuid-MLfaDW-hcSX-UUuz-T6hf-jnwD-2Ymd-7lmoLK', 'scsi-0QEMU_QEMU_HARDDISK_f6166654-1631-4845-81e5-73fa20742766', 'scsi-SQEMU_QEMU_HARDDISK_f6166654-1631-4845-81e5-73fa20742766'], 'labels': [], 'masters': ['dm-1'], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 41943040, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}})  2025-09-27 00:56:10.897587 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'loop4', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2025-09-27 00:56:10.897599 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'sdb', 'value': {'holders': ['ceph--025d8a54--72cd--5dfc--843f--2890244ba468-osd--block--025d8a54--72cd--5dfc--843f--2890244ba468'], 'host': 'SCSI storage controller: Red Hat, Inc. Virtio SCSI', 'links': {'ids': ['lvm-pv-uuid-9rawNn-y563-rGNc-kwv8-GzbT-nvxJ-Bf2wvf', 'scsi-0QEMU_QEMU_HARDDISK_db689dff-d74e-43e3-a305-79ec0de29e1e', 'scsi-SQEMU_QEMU_HARDDISK_db689dff-d74e-43e3-a305-79ec0de29e1e'], 'labels': [], 'masters': ['dm-0'], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 41943040, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}})  2025-09-27 00:56:10.897611 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'sdd', 'value': {'holders': [], 'host': 'SCSI storage controller: Red Hat, Inc. Virtio SCSI', 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_88b94aa1-4c02-44af-bedb-78cbed569408', 'scsi-SQEMU_QEMU_HARDDISK_88b94aa1-4c02-44af-bedb-78cbed569408'], 'labels': [], 'masters': [], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 41943040, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}})  2025-09-27 00:56:10.897634 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'sdc', 'value': {'holders': ['ceph--9ca7935d--e986--5962--b530--505e6c7ac609-osd--block--9ca7935d--e986--5962--b530--505e6c7ac609'], 'host': 'SCSI storage controller: Red Hat, Inc. 
Virtio SCSI', 'links': {'ids': ['lvm-pv-uuid-SGXdOj-WXQT-SPT5-jcYu-xrdU-Qh2y-VcSwqS', 'scsi-0QEMU_QEMU_HARDDISK_aa54db64-5ca4-4f56-bafa-5b00a4002696', 'scsi-SQEMU_QEMU_HARDDISK_aa54db64-5ca4-4f56-bafa-5b00a4002696'], 'labels': [], 'masters': ['dm-1'], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 41943040, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}})  2025-09-27 00:56:10.897651 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'sr0', 'value': {'holders': [], 'host': 'IDE interface: Intel Corporation 82371SB PIIX3 IDE [Natoma/Triton II]', 'links': {'ids': ['ata-QEMU_DVD-ROM_QM00001'], 'labels': ['config-2'], 'masters': [], 'uuids': ['2025-09-27-00-02-11-00']}, 'model': 'QEMU DVD-ROM', 'partitions': {}, 'removable': '1', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'mq-deadline', 'sectors': 253, 'sectorsize': '2048', 'size': '506.00 KB', 'support_discard': '0', 'vendor': 'QEMU', 'virtual': 1}})  2025-09-27 00:56:10.897663 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'loop5', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2025-09-27 00:56:10.897675 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'sdd', 'value': {'holders': [], 'host': 'SCSI storage controller: Red Hat, Inc. Virtio SCSI', 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_09efdf41-dbe9-4aba-b0d6-c49a377077cc', 'scsi-SQEMU_QEMU_HARDDISK_09efdf41-dbe9-4aba-b0d6-c49a377077cc'], 'labels': [], 'masters': [], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 41943040, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}})  2025-09-27 00:56:10.897693 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:56:10.897704 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'loop6', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2025-09-27 00:56:10.897716 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'sr0', 'value': {'holders': [], 'host': 'IDE interface: Intel Corporation 82371SB PIIX3 IDE [Natoma/Triton II]', 'links': {'ids': ['ata-QEMU_DVD-ROM_QM00001'], 'labels': ['config-2'], 'masters': [], 'uuids': ['2025-09-27-00-02-13-00']}, 'model': 'QEMU DVD-ROM', 'partitions': {}, 'removable': '1', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'mq-deadline', 'sectors': 253, 'sectorsize': '2048', 'size': '506.00 KB', 'support_discard': '0', 'vendor': 'QEMU', 'virtual': 1}})  2025-09-27 00:56:10.897727 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'loop7', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': 
[], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}})  2025-09-27 00:56:10.897738 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:56:10.897763 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'sda', 'value': {'holders': [], 'host': 'SCSI storage controller: Red Hat, Inc. Virtio SCSI', 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_140074d2-f452-403c-a39b-9b7f03d301d0', 'scsi-SQEMU_QEMU_HARDDISK_140074d2-f452-403c-a39b-9b7f03d301d0'], 'labels': [], 'masters': [], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {'sda1': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_140074d2-f452-403c-a39b-9b7f03d301d0-part1', 'scsi-SQEMU_QEMU_HARDDISK_140074d2-f452-403c-a39b-9b7f03d301d0-part1'], 'labels': ['cloudimg-rootfs'], 'masters': [], 'uuids': ['b852d8d2-8460-44aa-8998-23e4f04d73cf']}, 'sectors': 165672927, 'sectorsize': 512, 'size': '79.00 GB', 'start': '2099200', 'uuid': 'b852d8d2-8460-44aa-8998-23e4f04d73cf'}, 'sda14': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_140074d2-f452-403c-a39b-9b7f03d301d0-part14', 'scsi-SQEMU_QEMU_HARDDISK_140074d2-f452-403c-a39b-9b7f03d301d0-part14'], 'labels': [], 'masters': [], 'uuids': []}, 'sectors': 8192, 'sectorsize': 512, 'size': '4.00 MB', 'start': '2048', 'uuid': None}, 'sda15': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_140074d2-f452-403c-a39b-9b7f03d301d0-part15', 'scsi-SQEMU_QEMU_HARDDISK_140074d2-f452-403c-a39b-9b7f03d301d0-part15'], 'labels': ['UEFI'], 'masters': [], 'uuids': ['5C78-612A']}, 'sectors': 217088, 'sectorsize': 512, 'size': '106.00 MB', 'start': '10240', 'uuid': '5C78-612A'}, 'sda16': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_140074d2-f452-403c-a39b-9b7f03d301d0-part16', 'scsi-SQEMU_QEMU_HARDDISK_140074d2-f452-403c-a39b-9b7f03d301d0-part16'], 'labels': ['BOOT'], 'masters': [], 'uuids': ['09d53dc1-1e03-4286-bbb8-2b1796cf92ec']}, 'sectors': 1869825, 'sectorsize': 512, 'size': '913.00 MB', 'start': '227328', 'uuid': '09d53dc1-1e03-4286-bbb8-2b1796cf92ec'}}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 167772160, 'sectorsize': '512', 'size': '80.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}})  2025-09-27 00:56:10.897782 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'sdb', 'value': {'holders': ['ceph--03e94b17--8e91--5aba--9ae0--0b9f0a63cf06-osd--block--03e94b17--8e91--5aba--9ae0--0b9f0a63cf06'], 'host': 'SCSI storage controller: Red Hat, Inc. 
Virtio SCSI', 'links': {'ids': ['lvm-pv-uuid-3Sibgw-UrUF-sguV-cbL6-UeGk-LmaD-n7sO1o', 'scsi-0QEMU_QEMU_HARDDISK_44ee43e4-0ad4-479b-91ef-60ee60e7859d', 'scsi-SQEMU_QEMU_HARDDISK_44ee43e4-0ad4-479b-91ef-60ee60e7859d'], 'labels': [], 'masters': ['dm-0'], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 41943040, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}})  2025-09-27 00:56:10.897794 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'sdc', 'value': {'holders': ['ceph--26537eb5--d37a--51fe--a7ad--0ae3582304de-osd--block--26537eb5--d37a--51fe--a7ad--0ae3582304de'], 'host': 'SCSI storage controller: Red Hat, Inc. Virtio SCSI', 'links': {'ids': ['lvm-pv-uuid-RSp72m-6Gr3-OGR0-2DWW-9yZY-dCTz-l1m7o8', 'scsi-0QEMU_QEMU_HARDDISK_3491b7a4-1f4d-422d-b24b-7572a092bd2f', 'scsi-SQEMU_QEMU_HARDDISK_3491b7a4-1f4d-422d-b24b-7572a092bd2f'], 'labels': [], 'masters': ['dm-1'], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 41943040, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}})  2025-09-27 00:56:10.897806 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'sdd', 'value': {'holders': [], 'host': 'SCSI storage controller: Red Hat, Inc. Virtio SCSI', 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_06352aa6-6cdc-4b09-96e0-787a93e7d706', 'scsi-SQEMU_QEMU_HARDDISK_06352aa6-6cdc-4b09-96e0-787a93e7d706'], 'labels': [], 'masters': [], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 41943040, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}})  2025-09-27 00:56:10.897823 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'sr0', 'value': {'holders': [], 'host': 'IDE interface: Intel Corporation 82371SB PIIX3 IDE [Natoma/Triton II]', 'links': {'ids': ['ata-QEMU_DVD-ROM_QM00001'], 'labels': ['config-2'], 'masters': [], 'uuids': ['2025-09-27-00-02-15-00']}, 'model': 'QEMU DVD-ROM', 'partitions': {}, 'removable': '1', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'mq-deadline', 'sectors': 253, 'sectorsize': '2048', 'size': '506.00 KB', 'support_discard': '0', 'vendor': 'QEMU', 'virtual': 1}})  2025-09-27 00:56:10.897834 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:56:10.897845 | orchestrator | 2025-09-27 00:56:10.897856 | orchestrator | TASK [ceph-facts : Set_fact devices generate device list when osd_auto_discovery] *** 2025-09-27 00:56:10.897867 | orchestrator | Saturday 27 September 2025 00:54:15 +0000 (0:00:00.659) 0:00:16.117 **** 2025-09-27 00:56:10.897884 | orchestrator | skipping: [testbed-node-3] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'dm-0', 'value': {'holders': [], 'host': '', 'links': {'ids': ['dm-name-ceph--025d8a54--72cd--5dfc--843f--2890244ba468-osd--block--025d8a54--72cd--5dfc--843f--2890244ba468', 'dm-uuid-LVM-XHCwvuU1sjTmaK85YDSmR6G7sbpVpAP3JMUNdJjHiYoiRZ0xWzEN1AgLJsu20b10'], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 
'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': '', 'sectors': 41934848, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2025-09-27 00:56:10.897901 | orchestrator | skipping: [testbed-node-3] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'dm-1', 'value': {'holders': [], 'host': '', 'links': {'ids': ['dm-name-ceph--9ca7935d--e986--5962--b530--505e6c7ac609-osd--block--9ca7935d--e986--5962--b530--505e6c7ac609', 'dm-uuid-LVM-bDhFdDwT54ouiDFbfCRj6kE8iH1XDG316Ib1iSwqIA7E8LyFNu82J4CqDZUvs2si'], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': '', 'sectors': 41934848, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2025-09-27 00:56:10.897913 | orchestrator | skipping: [testbed-node-3] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'loop0', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2025-09-27 00:56:10.897924 | orchestrator | skipping: [testbed-node-3] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'loop1', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2025-09-27 00:56:10.897935 | orchestrator | skipping: [testbed-node-3] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'loop2', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2025-09-27 00:56:10.897960 | orchestrator | skipping: [testbed-node-4] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'dm-0', 'value': {'holders': [], 'host': '', 'links': {'ids': ['dm-name-ceph--e62f59a6--4044--5e93--b85c--9f8cca280e9f-osd--block--e62f59a6--4044--5e93--b85c--9f8cca280e9f', 'dm-uuid-LVM-2j0R2lsOV7uYm5mBjwcNbbSh1kQKC6aWFs37eraHciH3dG7KX2uUyrR9le9M1Sxc'], 'labels': [], 'masters': [], 'uuids': 
[]}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': '', 'sectors': 41934848, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2025-09-27 00:56:10.897973 | orchestrator | skipping: [testbed-node-4] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'dm-1', 'value': {'holders': [], 'host': '', 'links': {'ids': ['dm-name-ceph--634a63d2--bd22--5328--9676--28392545ed43-osd--block--634a63d2--bd22--5328--9676--28392545ed43', 'dm-uuid-LVM-UlAlFHjSEGexCmx3gfRFT7UDwjIGr9mS6RTZCSiLBafcbpblurexRxLJlqNlUnkx'], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': '', 'sectors': 41934848, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2025-09-27 00:56:10.897990 | orchestrator | skipping: [testbed-node-3] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'loop3', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2025-09-27 00:56:10.898002 | orchestrator | skipping: [testbed-node-3] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'loop4', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2025-09-27 00:56:10.898013 | orchestrator | skipping: [testbed-node-4] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'loop0', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2025-09-27 00:56:10.898143 | orchestrator | skipping: [testbed-node-4] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'loop1', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 
'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2025-09-27 00:56:10.898164 | orchestrator | skipping: [testbed-node-3] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'loop5', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2025-09-27 00:56:10.898181 | orchestrator | skipping: [testbed-node-4] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'loop2', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2025-09-27 00:56:10.898205 | orchestrator | skipping: [testbed-node-3] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'loop6', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2025-09-27 00:56:10.898217 | orchestrator | skipping: [testbed-node-4] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'loop3', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2025-09-27 00:56:10.898229 | orchestrator | skipping: [testbed-node-3] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'loop7', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2025-09-27 00:56:10.898255 | orchestrator | skipping: [testbed-node-3] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | 
default(False) | bool', 'item': {'key': 'sda', 'value': {'holders': [], 'host': 'SCSI storage controller: Red Hat, Inc. Virtio SCSI', 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_9830fc06-72ca-4b97-ae11-006364930d3a', 'scsi-SQEMU_QEMU_HARDDISK_9830fc06-72ca-4b97-ae11-006364930d3a'], 'labels': [], 'masters': [], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {'sda1': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_9830fc06-72ca-4b97-ae11-006364930d3a-part1', 'scsi-SQEMU_QEMU_HARDDISK_9830fc06-72ca-4b97-ae11-006364930d3a-part1'], 'labels': ['cloudimg-rootfs'], 'masters': [], 'uuids': ['b852d8d2-8460-44aa-8998-23e4f04d73cf']}, 'sectors': 165672927, 'sectorsize': 512, 'size': '79.00 GB', 'start': '2099200', 'uuid': 'b852d8d2-8460-44aa-8998-23e4f04d73cf'}, 'sda14': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_9830fc06-72ca-4b97-ae11-006364930d3a-part14', 'scsi-SQEMU_QEMU_HARDDISK_9830fc06-72ca-4b97-ae11-006364930d3a-part14'], 'labels': [], 'masters': [], 'uuids': []}, 'sectors': 8192, 'sectorsize': 512, 'size': '4.00 MB', 'start': '2048', 'uuid': None}, 'sda15': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_9830fc06-72ca-4b97-ae11-006364930d3a-part15', 'scsi-SQEMU_QEMU_HARDDISK_9830fc06-72ca-4b97-ae11-006364930d3a-part15'], 'labels': ['UEFI'], 'masters': [], 'uuids': ['5C78-612A']}, 'sectors': 217088, 'sectorsize': 512, 'size': '106.00 MB', 'start': '10240', 'uuid': '5C78-612A'}, 'sda16': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_9830fc06-72ca-4b97-ae11-006364930d3a-part16', 'scsi-SQEMU_QEMU_HARDDISK_9830fc06-72ca-4b97-ae11-006364930d3a-part16'], 'labels': ['BOOT'], 'masters': [], 'uuids': ['09d53dc1-1e03-4286-bbb8-2b1796cf92ec']}, 'sectors': 1869825, 'sectorsize': 512, 'size': '913.00 MB', 'start': '227328', 'uuid': '09d53dc1-1e03-4286-bbb8-2b1796cf92ec'}}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 167772160, 'sectorsize': '512', 'size': '80.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}}, 'ansible_loop_var': 'item'})  2025-09-27 00:56:10.898275 | orchestrator | skipping: [testbed-node-4] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'loop4', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2025-09-27 00:56:10.898288 | orchestrator | skipping: [testbed-node-3] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'sdb', 'value': {'holders': ['ceph--025d8a54--72cd--5dfc--843f--2890244ba468-osd--block--025d8a54--72cd--5dfc--843f--2890244ba468'], 'host': 'SCSI storage controller: Red Hat, Inc. 
Virtio SCSI', 'links': {'ids': ['lvm-pv-uuid-9rawNn-y563-rGNc-kwv8-GzbT-nvxJ-Bf2wvf', 'scsi-0QEMU_QEMU_HARDDISK_db689dff-d74e-43e3-a305-79ec0de29e1e', 'scsi-SQEMU_QEMU_HARDDISK_db689dff-d74e-43e3-a305-79ec0de29e1e'], 'labels': [], 'masters': ['dm-0'], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 41943040, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}}, 'ansible_loop_var': 'item'})  2025-09-27 00:56:10.898299 | orchestrator | skipping: [testbed-node-4] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'loop5', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2025-09-27 00:56:10.898311 | orchestrator | skipping: [testbed-node-3] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'sdc', 'value': {'holders': ['ceph--9ca7935d--e986--5962--b530--505e6c7ac609-osd--block--9ca7935d--e986--5962--b530--505e6c7ac609'], 'host': 'SCSI storage controller: Red Hat, Inc. Virtio SCSI', 'links': {'ids': ['lvm-pv-uuid-SGXdOj-WXQT-SPT5-jcYu-xrdU-Qh2y-VcSwqS', 'scsi-0QEMU_QEMU_HARDDISK_aa54db64-5ca4-4f56-bafa-5b00a4002696', 'scsi-SQEMU_QEMU_HARDDISK_aa54db64-5ca4-4f56-bafa-5b00a4002696'], 'labels': [], 'masters': ['dm-1'], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 41943040, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}}, 'ansible_loop_var': 'item'})  2025-09-27 00:56:10.898338 | orchestrator | skipping: [testbed-node-3] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'sdd', 'value': {'holders': [], 'host': 'SCSI storage controller: Red Hat, Inc. 
Virtio SCSI', 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_09efdf41-dbe9-4aba-b0d6-c49a377077cc', 'scsi-SQEMU_QEMU_HARDDISK_09efdf41-dbe9-4aba-b0d6-c49a377077cc'], 'labels': [], 'masters': [], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 41943040, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}}, 'ansible_loop_var': 'item'})  2025-09-27 00:56:10.898357 | orchestrator | skipping: [testbed-node-4] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'loop6', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2025-09-27 00:56:10.898369 | orchestrator | skipping: [testbed-node-3] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'sr0', 'value': {'holders': [], 'host': 'IDE interface: Intel Corporation 82371SB PIIX3 IDE [Natoma/Triton II]', 'links': {'ids': ['ata-QEMU_DVD-ROM_QM00001'], 'labels': ['config-2'], 'masters': [], 'uuids': ['2025-09-27-00-02-13-00']}, 'model': 'QEMU DVD-ROM', 'partitions': {}, 'removable': '1', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'mq-deadline', 'sectors': 253, 'sectorsize': '2048', 'size': '506.00 KB', 'support_discard': '0', 'vendor': 'QEMU', 'virtual': 1}}, 'ansible_loop_var': 'item'})  2025-09-27 00:56:10.898381 | orchestrator | skipping: [testbed-node-4] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'loop7', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2025-09-27 00:56:10.898406 | orchestrator | skipping: [testbed-node-4] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'sda', 'value': {'holders': [], 'host': 'SCSI storage controller: Red Hat, Inc. 
Virtio SCSI', 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_ccfdba47-3be1-47ca-9d9b-c14dc21688e2', 'scsi-SQEMU_QEMU_HARDDISK_ccfdba47-3be1-47ca-9d9b-c14dc21688e2'], 'labels': [], 'masters': [], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {'sda1': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_ccfdba47-3be1-47ca-9d9b-c14dc21688e2-part1', 'scsi-SQEMU_QEMU_HARDDISK_ccfdba47-3be1-47ca-9d9b-c14dc21688e2-part1'], 'labels': ['cloudimg-rootfs'], 'masters': [], 'uuids': ['b852d8d2-8460-44aa-8998-23e4f04d73cf']}, 'sectors': 165672927, 'sectorsize': 512, 'size': '79.00 GB', 'start': '2099200', 'uuid': 'b852d8d2-8460-44aa-8998-23e4f04d73cf'}, 'sda14': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_ccfdba47-3be1-47ca-9d9b-c14dc21688e2-part14', 'scsi-SQEMU_QEMU_HARDDISK_ccfdba47-3be1-47ca-9d9b-c14dc21688e2-part14'], 'labels': [], 'masters': [], 'uuids': []}, 'sectors': 8192, 'sectorsize': 512, 'size': '4.00 MB', 'start': '2048', 'uuid': None}, 'sda15': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_ccfdba47-3be1-47ca-9d9b-c14dc21688e2-part15', 'scsi-SQEMU_QEMU_HARDDISK_ccfdba47-3be1-47ca-9d9b-c14dc21688e2-part15'], 'labels': ['UEFI'], 'masters': [], 'uuids': ['5C78-612A']}, 'sectors': 217088, 'sectorsize': 512, 'size': '106.00 MB', 'start': '10240', 'uuid': '5C78-612A'}, 'sda16': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_ccfdba47-3be1-47ca-9d9b-c14dc21688e2-part16', 'scsi-SQEMU_QEMU_HARDDISK_ccfdba47-3be1-47ca-9d9b-c14dc21688e2-part16'], 'labels': ['BOOT'], 'masters': [], 'uuids': ['09d53dc1-1e03-4286-bbb8-2b1796cf92ec']}, 'sectors': 1869825, 'sectorsize': 512, 'size': '913.00 MB', 'start': '227328', 'uuid': '09d53dc1-1e03-4286-bbb8-2b1796cf92ec'}}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 167772160, 'sectorsize': '512', 'size': '80.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}}, 'ansible_loop_var': 'item'})  2025-09-27 00:56:10.898426 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:56:10.898437 | orchestrator | skipping: [testbed-node-4] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'sdb', 'value': {'holders': ['ceph--e62f59a6--4044--5e93--b85c--9f8cca280e9f-osd--block--e62f59a6--4044--5e93--b85c--9f8cca280e9f'], 'host': 'SCSI storage controller: Red Hat, Inc. Virtio SCSI', 'links': {'ids': ['lvm-pv-uuid-Me9aue-lek3-JPg0-VYec-326H-ZuKM-XDWaPz', 'scsi-0QEMU_QEMU_HARDDISK_e258aa1c-ff59-4b5b-956f-d2cfc00f460b', 'scsi-SQEMU_QEMU_HARDDISK_e258aa1c-ff59-4b5b-956f-d2cfc00f460b'], 'labels': [], 'masters': ['dm-0'], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 41943040, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}}, 'ansible_loop_var': 'item'})  2025-09-27 00:56:10.898450 | orchestrator | skipping: [testbed-node-4] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'sdc', 'value': {'holders': ['ceph--634a63d2--bd22--5328--9676--28392545ed43-osd--block--634a63d2--bd22--5328--9676--28392545ed43'], 'host': 'SCSI storage controller: Red Hat, Inc. 
Virtio SCSI', 'links': {'ids': ['lvm-pv-uuid-MLfaDW-hcSX-UUuz-T6hf-jnwD-2Ymd-7lmoLK', 'scsi-0QEMU_QEMU_HARDDISK_f6166654-1631-4845-81e5-73fa20742766', 'scsi-SQEMU_QEMU_HARDDISK_f6166654-1631-4845-81e5-73fa20742766'], 'labels': [], 'masters': ['dm-1'], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 41943040, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}}, 'ansible_loop_var': 'item'})  2025-09-27 00:56:10.898462 | orchestrator | skipping: [testbed-node-4] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'sdd', 'value': {'holders': [], 'host': 'SCSI storage controller: Red Hat, Inc. Virtio SCSI', 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_88b94aa1-4c02-44af-bedb-78cbed569408', 'scsi-SQEMU_QEMU_HARDDISK_88b94aa1-4c02-44af-bedb-78cbed569408'], 'labels': [], 'masters': [], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 41943040, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}}, 'ansible_loop_var': 'item'})  2025-09-27 00:56:10.898481 | orchestrator | skipping: [testbed-node-4] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'sr0', 'value': {'holders': [], 'host': 'IDE interface: Intel Corporation 82371SB PIIX3 IDE [Natoma/Triton II]', 'links': {'ids': ['ata-QEMU_DVD-ROM_QM00001'], 'labels': ['config-2'], 'masters': [], 'uuids': ['2025-09-27-00-02-11-00']}, 'model': 'QEMU DVD-ROM', 'partitions': {}, 'removable': '1', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'mq-deadline', 'sectors': 253, 'sectorsize': '2048', 'size': '506.00 KB', 'support_discard': '0', 'vendor': 'QEMU', 'virtual': 1}}, 'ansible_loop_var': 'item'})  2025-09-27 00:56:10.898499 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:56:10.898515 | orchestrator | skipping: [testbed-node-5] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'dm-0', 'value': {'holders': [], 'host': '', 'links': {'ids': ['dm-name-ceph--03e94b17--8e91--5aba--9ae0--0b9f0a63cf06-osd--block--03e94b17--8e91--5aba--9ae0--0b9f0a63cf06', 'dm-uuid-LVM-K7Add9racGU2L9Njoe4PiYwcIpDjr05MSre6J2Y3OxofcXM429pZ0szxlstbspnD'], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': '', 'sectors': 41934848, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2025-09-27 00:56:10.898527 | orchestrator | skipping: [testbed-node-5] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'dm-1', 'value': {'holders': [], 'host': '', 'links': {'ids': ['dm-name-ceph--26537eb5--d37a--51fe--a7ad--0ae3582304de-osd--block--26537eb5--d37a--51fe--a7ad--0ae3582304de', 
'dm-uuid-LVM-9u1Et2K86e5ar3MRSDs86Q4n2GlHJ9LdwOYDbas8h12ne7B2BpRv09Athluk4exq'], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': '', 'sectors': 41934848, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2025-09-27 00:56:10.898539 | orchestrator | skipping: [testbed-node-5] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'loop0', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2025-09-27 00:56:10.898550 | orchestrator | skipping: [testbed-node-5] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'loop1', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2025-09-27 00:56:10.898561 | orchestrator | skipping: [testbed-node-5] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'loop2', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2025-09-27 00:56:10.898579 | orchestrator | skipping: [testbed-node-5] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'loop3', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2025-09-27 00:56:10.898601 | orchestrator | skipping: [testbed-node-5] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'loop4', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': 
'0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2025-09-27 00:56:10.898612 | orchestrator | skipping: [testbed-node-5] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'loop5', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2025-09-27 00:56:10.898624 | orchestrator | skipping: [testbed-node-5] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'loop6', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2025-09-27 00:56:10.898635 | orchestrator | skipping: [testbed-node-5] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'loop7', 'value': {'holders': [], 'host': '', 'links': {'ids': [], 'labels': [], 'masters': [], 'uuids': []}, 'model': None, 'partitions': {}, 'removable': '0', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 0, 'sectorsize': '512', 'size': '0.00 Bytes', 'support_discard': '0', 'vendor': None, 'virtual': 1}}, 'ansible_loop_var': 'item'})  2025-09-27 00:56:10.898660 | orchestrator | skipping: [testbed-node-5] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'sda', 'value': {'holders': [], 'host': 'SCSI storage controller: Red Hat, Inc. 
Virtio SCSI', 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_140074d2-f452-403c-a39b-9b7f03d301d0', 'scsi-SQEMU_QEMU_HARDDISK_140074d2-f452-403c-a39b-9b7f03d301d0'], 'labels': [], 'masters': [], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {'sda1': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_140074d2-f452-403c-a39b-9b7f03d301d0-part1', 'scsi-SQEMU_QEMU_HARDDISK_140074d2-f452-403c-a39b-9b7f03d301d0-part1'], 'labels': ['cloudimg-rootfs'], 'masters': [], 'uuids': ['b852d8d2-8460-44aa-8998-23e4f04d73cf']}, 'sectors': 165672927, 'sectorsize': 512, 'size': '79.00 GB', 'start': '2099200', 'uuid': 'b852d8d2-8460-44aa-8998-23e4f04d73cf'}, 'sda14': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_140074d2-f452-403c-a39b-9b7f03d301d0-part14', 'scsi-SQEMU_QEMU_HARDDISK_140074d2-f452-403c-a39b-9b7f03d301d0-part14'], 'labels': [], 'masters': [], 'uuids': []}, 'sectors': 8192, 'sectorsize': 512, 'size': '4.00 MB', 'start': '2048', 'uuid': None}, 'sda15': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_140074d2-f452-403c-a39b-9b7f03d301d0-part15', 'scsi-SQEMU_QEMU_HARDDISK_140074d2-f452-403c-a39b-9b7f03d301d0-part15'], 'labels': ['UEFI'], 'masters': [], 'uuids': ['5C78-612A']}, 'sectors': 217088, 'sectorsize': 512, 'size': '106.00 MB', 'start': '10240', 'uuid': '5C78-612A'}, 'sda16': {'holders': [], 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_140074d2-f452-403c-a39b-9b7f03d301d0-part16', 'scsi-SQEMU_QEMU_HARDDISK_140074d2-f452-403c-a39b-9b7f03d301d0-part16'], 'labels': ['BOOT'], 'masters': [], 'uuids': ['09d53dc1-1e03-4286-bbb8-2b1796cf92ec']}, 'sectors': 1869825, 'sectorsize': 512, 'size': '913.00 MB', 'start': '227328', 'uuid': '09d53dc1-1e03-4286-bbb8-2b1796cf92ec'}}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 167772160, 'sectorsize': '512', 'size': '80.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}}, 'ansible_loop_var': 'item'})  2025-09-27 00:56:10.898680 | orchestrator | skipping: [testbed-node-5] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'sdb', 'value': {'holders': ['ceph--03e94b17--8e91--5aba--9ae0--0b9f0a63cf06-osd--block--03e94b17--8e91--5aba--9ae0--0b9f0a63cf06'], 'host': 'SCSI storage controller: Red Hat, Inc. Virtio SCSI', 'links': {'ids': ['lvm-pv-uuid-3Sibgw-UrUF-sguV-cbL6-UeGk-LmaD-n7sO1o', 'scsi-0QEMU_QEMU_HARDDISK_44ee43e4-0ad4-479b-91ef-60ee60e7859d', 'scsi-SQEMU_QEMU_HARDDISK_44ee43e4-0ad4-479b-91ef-60ee60e7859d'], 'labels': [], 'masters': ['dm-0'], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 41943040, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}}, 'ansible_loop_var': 'item'})  2025-09-27 00:56:10.898691 | orchestrator | skipping: [testbed-node-5] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'sdc', 'value': {'holders': ['ceph--26537eb5--d37a--51fe--a7ad--0ae3582304de-osd--block--26537eb5--d37a--51fe--a7ad--0ae3582304de'], 'host': 'SCSI storage controller: Red Hat, Inc. 
Virtio SCSI', 'links': {'ids': ['lvm-pv-uuid-RSp72m-6Gr3-OGR0-2DWW-9yZY-dCTz-l1m7o8', 'scsi-0QEMU_QEMU_HARDDISK_3491b7a4-1f4d-422d-b24b-7572a092bd2f', 'scsi-SQEMU_QEMU_HARDDISK_3491b7a4-1f4d-422d-b24b-7572a092bd2f'], 'labels': [], 'masters': ['dm-1'], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 41943040, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}}, 'ansible_loop_var': 'item'})  2025-09-27 00:56:10.898703 | orchestrator | skipping: [testbed-node-5] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'sdd', 'value': {'holders': [], 'host': 'SCSI storage controller: Red Hat, Inc. Virtio SCSI', 'links': {'ids': ['scsi-0QEMU_QEMU_HARDDISK_06352aa6-6cdc-4b09-96e0-787a93e7d706', 'scsi-SQEMU_QEMU_HARDDISK_06352aa6-6cdc-4b09-96e0-787a93e7d706'], 'labels': [], 'masters': [], 'uuids': []}, 'model': 'QEMU HARDDISK', 'partitions': {}, 'removable': '0', 'rotational': '1', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'none', 'sectors': 41943040, 'sectorsize': '512', 'size': '20.00 GB', 'support_discard': '4096', 'vendor': 'QEMU', 'virtual': 1}}, 'ansible_loop_var': 'item'})  2025-09-27 00:56:10.898720 | orchestrator | skipping: [testbed-node-5] => (item={'changed': False, 'skipped': True, 'skip_reason': 'Conditional result was False', 'false_condition': 'osd_auto_discovery | default(False) | bool', 'item': {'key': 'sr0', 'value': {'holders': [], 'host': 'IDE interface: Intel Corporation 82371SB PIIX3 IDE [Natoma/Triton II]', 'links': {'ids': ['ata-QEMU_DVD-ROM_QM00001'], 'labels': ['config-2'], 'masters': [], 'uuids': ['2025-09-27-00-02-15-00']}, 'model': 'QEMU DVD-ROM', 'partitions': {}, 'removable': '1', 'rotational': '0', 'sas_address': None, 'sas_device_handle': None, 'scheduler_mode': 'mq-deadline', 'sectors': 253, 'sectorsize': '2048', 'size': '506.00 KB', 'support_discard': '0', 'vendor': 'QEMU', 'virtual': 1}}, 'ansible_loop_var': 'item'})  2025-09-27 00:56:10.898738 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:56:10.898749 | orchestrator | 2025-09-27 00:56:10.898761 | orchestrator | TASK [ceph-facts : Check if the ceph conf exists] ****************************** 2025-09-27 00:56:10.898776 | orchestrator | Saturday 27 September 2025 00:54:16 +0000 (0:00:00.787) 0:00:16.905 **** 2025-09-27 00:56:10.898788 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:56:10.898799 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:56:10.898809 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:56:10.898820 | orchestrator | 2025-09-27 00:56:10.898831 | orchestrator | TASK [ceph-facts : Set default osd_pool_default_crush_rule fact] *************** 2025-09-27 00:56:10.898842 | orchestrator | Saturday 27 September 2025 00:54:17 +0000 (0:00:00.685) 0:00:17.591 **** 2025-09-27 00:56:10.898853 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:56:10.898864 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:56:10.898874 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:56:10.898885 | orchestrator | 2025-09-27 00:56:10.898896 | orchestrator | TASK [ceph-facts : Read osd pool default crush rule] *************************** 2025-09-27 00:56:10.898907 | orchestrator | Saturday 27 September 2025 00:54:17 +0000 (0:00:00.504) 0:00:18.095 **** 2025-09-27 00:56:10.898917 | 
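
Every per-device "skipping" item above comes from one ceph-facts task that loops over ansible_facts['devices'] and only acts when osd_auto_discovery is true; the testbed leaves it at false, so each loop item is skipped and its full device dictionary is echoed back. A minimal sketch of that loop shape (illustrative only, not the ceph-ansible source; the filter conditions are assumptions added for clarity):

- name: Generate devices list when osd_auto_discovery is enabled (sketch)
  ansible.builtin.set_fact:
    devices: "{{ devices | default([]) + ['/dev/' + item.key] }}"
  when:
    - osd_auto_discovery | default(False) | bool   # false in this run, so every item is skipped
    - item.value.removable == '0'                  # skip CD-ROMs such as sr0
    - item.value.partitions | length == 0          # skip the partitioned root disk (sda)
    - item.value.holders | length == 0             # skip disks already claimed by LVM/Ceph (sdb, sdc)
  loop: "{{ ansible_facts['devices'] | dict2items }}"
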
orchestrator | ok: [testbed-node-3] 2025-09-27 00:56:10.898928 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:56:10.898939 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:56:10.898950 | orchestrator | 2025-09-27 00:56:10.898961 | orchestrator | TASK [ceph-facts : Set osd_pool_default_crush_rule fact] *********************** 2025-09-27 00:56:10.898972 | orchestrator | Saturday 27 September 2025 00:54:19 +0000 (0:00:01.721) 0:00:19.817 **** 2025-09-27 00:56:10.898982 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:56:10.898993 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:56:10.899004 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:56:10.899015 | orchestrator | 2025-09-27 00:56:10.899026 | orchestrator | TASK [ceph-facts : Read osd pool default crush rule] *************************** 2025-09-27 00:56:10.899036 | orchestrator | Saturday 27 September 2025 00:54:19 +0000 (0:00:00.304) 0:00:20.121 **** 2025-09-27 00:56:10.899047 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:56:10.899074 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:56:10.899085 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:56:10.899096 | orchestrator | 2025-09-27 00:56:10.899107 | orchestrator | TASK [ceph-facts : Set osd_pool_default_crush_rule fact] *********************** 2025-09-27 00:56:10.899118 | orchestrator | Saturday 27 September 2025 00:54:20 +0000 (0:00:00.411) 0:00:20.533 **** 2025-09-27 00:56:10.899128 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:56:10.899139 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:56:10.899150 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:56:10.899161 | orchestrator | 2025-09-27 00:56:10.899172 | orchestrator | TASK [ceph-facts : Set_fact _monitor_addresses - ipv4] ************************* 2025-09-27 00:56:10.899183 | orchestrator | Saturday 27 September 2025 00:54:20 +0000 (0:00:00.504) 0:00:21.038 **** 2025-09-27 00:56:10.899194 | orchestrator | ok: [testbed-node-3] => (item=testbed-node-0) 2025-09-27 00:56:10.899204 | orchestrator | ok: [testbed-node-4] => (item=testbed-node-0) 2025-09-27 00:56:10.899215 | orchestrator | ok: [testbed-node-5] => (item=testbed-node-0) 2025-09-27 00:56:10.899226 | orchestrator | ok: [testbed-node-3] => (item=testbed-node-1) 2025-09-27 00:56:10.899237 | orchestrator | ok: [testbed-node-4] => (item=testbed-node-1) 2025-09-27 00:56:10.899248 | orchestrator | ok: [testbed-node-5] => (item=testbed-node-1) 2025-09-27 00:56:10.899266 | orchestrator | ok: [testbed-node-3] => (item=testbed-node-2) 2025-09-27 00:56:10.899277 | orchestrator | ok: [testbed-node-4] => (item=testbed-node-2) 2025-09-27 00:56:10.899287 | orchestrator | ok: [testbed-node-5] => (item=testbed-node-2) 2025-09-27 00:56:10.899298 | orchestrator | 2025-09-27 00:56:10.899309 | orchestrator | TASK [ceph-facts : Set_fact _monitor_addresses - ipv6] ************************* 2025-09-27 00:56:10.899320 | orchestrator | Saturday 27 September 2025 00:54:21 +0000 (0:00:00.892) 0:00:21.930 **** 2025-09-27 00:56:10.899331 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-0)  2025-09-27 00:56:10.899342 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-1)  2025-09-27 00:56:10.899352 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-2)  2025-09-27 00:56:10.899363 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:56:10.899374 | orchestrator | skipping: [testbed-node-4] => (item=testbed-node-0)  2025-09-27 00:56:10.899385 | orchestrator | 
skipping: [testbed-node-4] => (item=testbed-node-1)  2025-09-27 00:56:10.899396 | orchestrator | skipping: [testbed-node-4] => (item=testbed-node-2)  2025-09-27 00:56:10.899406 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:56:10.899417 | orchestrator | skipping: [testbed-node-5] => (item=testbed-node-0)  2025-09-27 00:56:10.899428 | orchestrator | skipping: [testbed-node-5] => (item=testbed-node-1)  2025-09-27 00:56:10.899438 | orchestrator | skipping: [testbed-node-5] => (item=testbed-node-2)  2025-09-27 00:56:10.899449 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:56:10.899460 | orchestrator | 2025-09-27 00:56:10.899471 | orchestrator | TASK [ceph-facts : Import_tasks set_radosgw_address.yml] *********************** 2025-09-27 00:56:10.899481 | orchestrator | Saturday 27 September 2025 00:54:21 +0000 (0:00:00.397) 0:00:22.328 **** 2025-09-27 00:56:10.899492 | orchestrator | included: /ansible/roles/ceph-facts/tasks/set_radosgw_address.yml for testbed-node-3, testbed-node-4, testbed-node-5 2025-09-27 00:56:10.899504 | orchestrator | 2025-09-27 00:56:10.899515 | orchestrator | TASK [ceph-facts : Set current radosgw_address_block, radosgw_address, radosgw_interface from node "{{ ceph_dashboard_call_item }}"] *** 2025-09-27 00:56:10.899526 | orchestrator | Saturday 27 September 2025 00:54:22 +0000 (0:00:00.716) 0:00:23.044 **** 2025-09-27 00:56:10.899537 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:56:10.899547 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:56:10.899558 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:56:10.899569 | orchestrator | 2025-09-27 00:56:10.899585 | orchestrator | TASK [ceph-facts : Set_fact _radosgw_address to radosgw_address_block ipv4] **** 2025-09-27 00:56:10.899596 | orchestrator | Saturday 27 September 2025 00:54:22 +0000 (0:00:00.320) 0:00:23.365 **** 2025-09-27 00:56:10.899607 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:56:10.899617 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:56:10.899628 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:56:10.899639 | orchestrator | 2025-09-27 00:56:10.899650 | orchestrator | TASK [ceph-facts : Set_fact _radosgw_address to radosgw_address_block ipv6] **** 2025-09-27 00:56:10.899666 | orchestrator | Saturday 27 September 2025 00:54:23 +0000 (0:00:00.307) 0:00:23.672 **** 2025-09-27 00:56:10.899677 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:56:10.899688 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:56:10.899699 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:56:10.899709 | orchestrator | 2025-09-27 00:56:10.899720 | orchestrator | TASK [ceph-facts : Set_fact _radosgw_address to radosgw_address] *************** 2025-09-27 00:56:10.899731 | orchestrator | Saturday 27 September 2025 00:54:23 +0000 (0:00:00.293) 0:00:23.965 **** 2025-09-27 00:56:10.899742 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:56:10.899753 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:56:10.899764 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:56:10.899775 | orchestrator | 2025-09-27 00:56:10.899786 | orchestrator | TASK [ceph-facts : Set_fact _interface] **************************************** 2025-09-27 00:56:10.899796 | orchestrator | Saturday 27 September 2025 00:54:24 +0000 (0:00:00.687) 0:00:24.653 **** 2025-09-27 00:56:10.899814 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-3)  2025-09-27 00:56:10.899825 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-4)  2025-09-27 
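
The _monitor_addresses fact set above is just a list of {name, addr} pairs, one per monitor (testbed-node-0 through testbed-node-2); its IPv6 twin is the task shown as skipped on every node. A rough sketch of how such a fact can be assembled (illustrative, not the ceph-ansible source; monitor_address and ip_version are assumed variable names):

- name: Set_fact _monitor_addresses - ipv4 (sketch)
  ansible.builtin.set_fact:
    _monitor_addresses: >-
      {{ _monitor_addresses | default([])
         + [{'name': item, 'addr': hostvars[item]['monitor_address']}] }}
  loop: "{{ groups[mon_group_name] }}"
  when: ip_version == 'ipv4'   # the ipv6 variant of this task is the one skipped above
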
00:56:10.899836 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-5)  2025-09-27 00:56:10.899846 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:56:10.899857 | orchestrator | 2025-09-27 00:56:10.899868 | orchestrator | TASK [ceph-facts : Set_fact _radosgw_address to radosgw_interface - ipv4] ****** 2025-09-27 00:56:10.899879 | orchestrator | Saturday 27 September 2025 00:54:24 +0000 (0:00:00.430) 0:00:25.083 **** 2025-09-27 00:56:10.899890 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-3)  2025-09-27 00:56:10.899901 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-4)  2025-09-27 00:56:10.899911 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-5)  2025-09-27 00:56:10.899922 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:56:10.899933 | orchestrator | 2025-09-27 00:56:10.899943 | orchestrator | TASK [ceph-facts : Set_fact _radosgw_address to radosgw_interface - ipv6] ****** 2025-09-27 00:56:10.899954 | orchestrator | Saturday 27 September 2025 00:54:25 +0000 (0:00:00.419) 0:00:25.503 **** 2025-09-27 00:56:10.899965 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-3)  2025-09-27 00:56:10.899976 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-4)  2025-09-27 00:56:10.899986 | orchestrator | skipping: [testbed-node-3] => (item=testbed-node-5)  2025-09-27 00:56:10.899997 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:56:10.900008 | orchestrator | 2025-09-27 00:56:10.900019 | orchestrator | TASK [ceph-facts : Reset rgw_instances (workaround)] *************************** 2025-09-27 00:56:10.900030 | orchestrator | Saturday 27 September 2025 00:54:25 +0000 (0:00:00.361) 0:00:25.864 **** 2025-09-27 00:56:10.900040 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:56:10.900051 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:56:10.900112 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:56:10.900123 | orchestrator | 2025-09-27 00:56:10.900134 | orchestrator | TASK [ceph-facts : Set_fact rgw_instances] ************************************* 2025-09-27 00:56:10.900145 | orchestrator | Saturday 27 September 2025 00:54:25 +0000 (0:00:00.366) 0:00:26.231 **** 2025-09-27 00:56:10.900156 | orchestrator | ok: [testbed-node-3] => (item=0) 2025-09-27 00:56:10.900167 | orchestrator | ok: [testbed-node-4] => (item=0) 2025-09-27 00:56:10.900177 | orchestrator | ok: [testbed-node-5] => (item=0) 2025-09-27 00:56:10.900188 | orchestrator | 2025-09-27 00:56:10.900199 | orchestrator | TASK [ceph-facts : Set_fact ceph_run_cmd] ************************************** 2025-09-27 00:56:10.900210 | orchestrator | Saturday 27 September 2025 00:54:26 +0000 (0:00:00.684) 0:00:26.915 **** 2025-09-27 00:56:10.900221 | orchestrator | ok: [testbed-node-3 -> testbed-node-0(192.168.16.10)] => (item=testbed-node-0) 2025-09-27 00:56:10.900232 | orchestrator | ok: [testbed-node-3 -> testbed-node-1(192.168.16.11)] => (item=testbed-node-1) 2025-09-27 00:56:10.900243 | orchestrator | ok: [testbed-node-3 -> testbed-node-2(192.168.16.12)] => (item=testbed-node-2) 2025-09-27 00:56:10.900253 | orchestrator | ok: [testbed-node-3] => (item=testbed-node-3) 2025-09-27 00:56:10.900264 | orchestrator | ok: [testbed-node-3 -> testbed-node-4(192.168.16.14)] => (item=testbed-node-4) 2025-09-27 00:56:10.900275 | orchestrator | ok: [testbed-node-3 -> testbed-node-5(192.168.16.15)] => (item=testbed-node-5) 2025-09-27 00:56:10.900286 | orchestrator | ok: [testbed-node-3 -> testbed-manager(192.168.16.5)] => 
(item=testbed-manager) 2025-09-27 00:56:10.900297 | orchestrator | 2025-09-27 00:56:10.900308 | orchestrator | TASK [ceph-facts : Set_fact ceph_admin_command] ******************************** 2025-09-27 00:56:10.900318 | orchestrator | Saturday 27 September 2025 00:54:27 +0000 (0:00:00.967) 0:00:27.883 **** 2025-09-27 00:56:10.900328 | orchestrator | ok: [testbed-node-3 -> testbed-node-0(192.168.16.10)] => (item=testbed-node-0) 2025-09-27 00:56:10.900338 | orchestrator | ok: [testbed-node-3 -> testbed-node-1(192.168.16.11)] => (item=testbed-node-1) 2025-09-27 00:56:10.900347 | orchestrator | ok: [testbed-node-3 -> testbed-node-2(192.168.16.12)] => (item=testbed-node-2) 2025-09-27 00:56:10.900369 | orchestrator | ok: [testbed-node-3] => (item=testbed-node-3) 2025-09-27 00:56:10.900379 | orchestrator | ok: [testbed-node-3 -> testbed-node-4(192.168.16.14)] => (item=testbed-node-4) 2025-09-27 00:56:10.900389 | orchestrator | ok: [testbed-node-3 -> testbed-node-5(192.168.16.15)] => (item=testbed-node-5) 2025-09-27 00:56:10.900399 | orchestrator | ok: [testbed-node-3 -> testbed-manager(192.168.16.5)] => (item=testbed-manager) 2025-09-27 00:56:10.900408 | orchestrator | 2025-09-27 00:56:10.900423 | orchestrator | TASK [Include tasks from the ceph-osd role] ************************************ 2025-09-27 00:56:10.900433 | orchestrator | Saturday 27 September 2025 00:54:29 +0000 (0:00:01.967) 0:00:29.851 **** 2025-09-27 00:56:10.900443 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:56:10.900452 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:56:10.900462 | orchestrator | included: /ansible/tasks/openstack_config.yml for testbed-node-5 2025-09-27 00:56:10.900472 | orchestrator | 2025-09-27 00:56:10.900481 | orchestrator | TASK [create openstack pool(s)] ************************************************ 2025-09-27 00:56:10.900496 | orchestrator | Saturday 27 September 2025 00:54:29 +0000 (0:00:00.415) 0:00:30.266 **** 2025-09-27 00:56:10.900507 | orchestrator | changed: [testbed-node-5 -> testbed-node-0(192.168.16.10)] => (item={'application': 'rbd', 'erasure_profile': '', 'expected_num_objects': '', 'min_size': 0, 'name': 'backups', 'pg_autoscale_mode': False, 'pg_num': 32, 'pgp_num': 32, 'rule_name': 'replicated_rule', 'size': 3, 'type': 1}) 2025-09-27 00:56:10.900517 | orchestrator | changed: [testbed-node-5 -> testbed-node-0(192.168.16.10)] => (item={'application': 'rbd', 'erasure_profile': '', 'expected_num_objects': '', 'min_size': 0, 'name': 'volumes', 'pg_autoscale_mode': False, 'pg_num': 32, 'pgp_num': 32, 'rule_name': 'replicated_rule', 'size': 3, 'type': 1}) 2025-09-27 00:56:10.900528 | orchestrator | changed: [testbed-node-5 -> testbed-node-0(192.168.16.10)] => (item={'application': 'rbd', 'erasure_profile': '', 'expected_num_objects': '', 'min_size': 0, 'name': 'images', 'pg_autoscale_mode': False, 'pg_num': 32, 'pgp_num': 32, 'rule_name': 'replicated_rule', 'size': 3, 'type': 1}) 2025-09-27 00:56:10.900538 | orchestrator | changed: [testbed-node-5 -> testbed-node-0(192.168.16.10)] => (item={'application': 'rbd', 'erasure_profile': '', 'expected_num_objects': '', 'min_size': 0, 'name': 'metrics', 'pg_autoscale_mode': False, 'pg_num': 32, 'pgp_num': 32, 'rule_name': 'replicated_rule', 'size': 3, 'type': 1}) 2025-09-27 00:56:10.900548 | orchestrator | changed: [testbed-node-5 -> testbed-node-0(192.168.16.10)] => (item={'application': 'rbd', 'erasure_profile': '', 'expected_num_objects': '', 'min_size': 0, 'name': 'vms', 'pg_autoscale_mode': False, 'pg_num': 32, 
'pgp_num': 32, 'rule_name': 'replicated_rule', 'size': 3, 'type': 1}) 2025-09-27 00:56:10.900558 | orchestrator | 2025-09-27 00:56:10.900568 | orchestrator | TASK [generate keys] *********************************************************** 2025-09-27 00:56:10.900577 | orchestrator | Saturday 27 September 2025 00:55:14 +0000 (0:00:44.747) 0:01:15.013 **** 2025-09-27 00:56:10.900587 | orchestrator | changed: [testbed-node-5 -> testbed-node-0(192.168.16.10)] => (item=None) 2025-09-27 00:56:10.900597 | orchestrator | changed: [testbed-node-5 -> testbed-node-0(192.168.16.10)] => (item=None) 2025-09-27 00:56:10.900606 | orchestrator | changed: [testbed-node-5 -> testbed-node-0(192.168.16.10)] => (item=None) 2025-09-27 00:56:10.900616 | orchestrator | changed: [testbed-node-5 -> testbed-node-0(192.168.16.10)] => (item=None) 2025-09-27 00:56:10.900626 | orchestrator | changed: [testbed-node-5 -> testbed-node-0(192.168.16.10)] => (item=None) 2025-09-27 00:56:10.900635 | orchestrator | changed: [testbed-node-5 -> testbed-node-0(192.168.16.10)] => (item=None) 2025-09-27 00:56:10.900645 | orchestrator | changed: [testbed-node-5 -> {{ groups[mon_group_name][0] }}] 2025-09-27 00:56:10.900654 | orchestrator | 2025-09-27 00:56:10.900664 | orchestrator | TASK [get keys from monitors] ************************************************** 2025-09-27 00:56:10.900674 | orchestrator | Saturday 27 September 2025 00:55:39 +0000 (0:00:24.589) 0:01:39.603 **** 2025-09-27 00:56:10.900689 | orchestrator | ok: [testbed-node-5 -> testbed-node-0(192.168.16.10)] => (item=None) 2025-09-27 00:56:10.900699 | orchestrator | ok: [testbed-node-5 -> testbed-node-0(192.168.16.10)] => (item=None) 2025-09-27 00:56:10.900708 | orchestrator | ok: [testbed-node-5 -> testbed-node-0(192.168.16.10)] => (item=None) 2025-09-27 00:56:10.900718 | orchestrator | ok: [testbed-node-5 -> testbed-node-0(192.168.16.10)] => (item=None) 2025-09-27 00:56:10.900728 | orchestrator | ok: [testbed-node-5 -> testbed-node-0(192.168.16.10)] => (item=None) 2025-09-27 00:56:10.900737 | orchestrator | ok: [testbed-node-5 -> testbed-node-0(192.168.16.10)] => (item=None) 2025-09-27 00:56:10.900747 | orchestrator | ok: [testbed-node-5 -> {{ groups.get(mon_group_name)[0] }}] 2025-09-27 00:56:10.900756 | orchestrator | 2025-09-27 00:56:10.900766 | orchestrator | TASK [copy ceph key(s) if needed] ********************************************** 2025-09-27 00:56:10.900776 | orchestrator | Saturday 27 September 2025 00:55:51 +0000 (0:00:12.060) 0:01:51.663 **** 2025-09-27 00:56:10.900785 | orchestrator | changed: [testbed-node-5 -> testbed-node-0(192.168.16.10)] => (item=None) 2025-09-27 00:56:10.900795 | orchestrator | changed: [testbed-node-5 -> testbed-node-1(192.168.16.11)] => (item=None) 2025-09-27 00:56:10.900804 | orchestrator | changed: [testbed-node-5 -> testbed-node-2(192.168.16.12)] => (item=None) 2025-09-27 00:56:10.900814 | orchestrator | changed: [testbed-node-5 -> testbed-node-0(192.168.16.10)] => (item=None) 2025-09-27 00:56:10.900823 | orchestrator | changed: [testbed-node-5 -> testbed-node-1(192.168.16.11)] => (item=None) 2025-09-27 00:56:10.900833 | orchestrator | changed: [testbed-node-5 -> testbed-node-2(192.168.16.12)] => (item=None) 2025-09-27 00:56:10.900848 | orchestrator | changed: [testbed-node-5 -> testbed-node-0(192.168.16.10)] => (item=None) 2025-09-27 00:56:10.900857 | orchestrator | changed: [testbed-node-5 -> testbed-node-1(192.168.16.11)] => (item=None) 2025-09-27 00:56:10.900867 | orchestrator | changed: [testbed-node-5 -> 
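
Each item in the "create openstack pool(s)" task above is a pool spec (pg_num/pgp_num 32, replicated_rule, size 3, autoscaler off, application rbd) applied on the first monitor, which is why the 45-second step is delegated to testbed-node-0. Expressed directly against the Ceph CLI, one pool from that list would be created roughly as follows (a sketch using the command module, not the actual task, and without the containerized exec wrapper the playbook uses):

- name: Create the backups pool and tag it for rbd (sketch)
  delegate_to: testbed-node-0
  ansible.builtin.command: "{{ item }}"
  loop:
    - ceph osd pool create backups 32 32 replicated replicated_rule
    - ceph osd pool set backups size 3
    - ceph osd pool set backups pg_autoscale_mode off
    - ceph osd pool application enable backups rbd
  changed_when: true
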
testbed-node-2(192.168.16.12)] => (item=None) 2025-09-27 00:56:10.900877 | orchestrator | changed: [testbed-node-5 -> testbed-node-0(192.168.16.10)] => (item=None) 2025-09-27 00:56:10.900886 | orchestrator | changed: [testbed-node-5 -> testbed-node-1(192.168.16.11)] => (item=None) 2025-09-27 00:56:10.900900 | orchestrator | changed: [testbed-node-5 -> testbed-node-2(192.168.16.12)] => (item=None) 2025-09-27 00:56:10.900911 | orchestrator | changed: [testbed-node-5 -> testbed-node-0(192.168.16.10)] => (item=None) 2025-09-27 00:56:10.900920 | orchestrator | changed: [testbed-node-5 -> testbed-node-1(192.168.16.11)] => (item=None) 2025-09-27 00:56:10.900930 | orchestrator | changed: [testbed-node-5 -> testbed-node-2(192.168.16.12)] => (item=None) 2025-09-27 00:56:10.900939 | orchestrator | changed: [testbed-node-5 -> testbed-node-0(192.168.16.10)] => (item=None) 2025-09-27 00:56:10.900949 | orchestrator | changed: [testbed-node-5 -> testbed-node-1(192.168.16.11)] => (item=None) 2025-09-27 00:56:10.900959 | orchestrator | changed: [testbed-node-5 -> testbed-node-2(192.168.16.12)] => (item=None) 2025-09-27 00:56:10.900968 | orchestrator | changed: [testbed-node-5 -> {{ item.1 }}] 2025-09-27 00:56:10.900978 | orchestrator | 2025-09-27 00:56:10.900988 | orchestrator | PLAY RECAP ********************************************************************* 2025-09-27 00:56:10.900998 | orchestrator | testbed-node-3 : ok=25  changed=0 unreachable=0 failed=0 skipped=28  rescued=0 ignored=0 2025-09-27 00:56:10.901009 | orchestrator | testbed-node-4 : ok=18  changed=0 unreachable=0 failed=0 skipped=21  rescued=0 ignored=0 2025-09-27 00:56:10.901019 | orchestrator | testbed-node-5 : ok=23  changed=3  unreachable=0 failed=0 skipped=20  rescued=0 ignored=0 2025-09-27 00:56:10.901029 | orchestrator | 2025-09-27 00:56:10.901039 | orchestrator | 2025-09-27 00:56:10.901048 | orchestrator | 2025-09-27 00:56:10.901072 | orchestrator | TASKS RECAP ******************************************************************** 2025-09-27 00:56:10.901082 | orchestrator | Saturday 27 September 2025 00:56:09 +0000 (0:00:18.425) 0:02:10.089 **** 2025-09-27 00:56:10.901097 | orchestrator | =============================================================================== 2025-09-27 00:56:10.901107 | orchestrator | create openstack pool(s) ----------------------------------------------- 44.75s 2025-09-27 00:56:10.901117 | orchestrator | generate keys ---------------------------------------------------------- 24.59s 2025-09-27 00:56:10.901126 | orchestrator | copy ceph key(s) if needed --------------------------------------------- 18.43s 2025-09-27 00:56:10.901136 | orchestrator | get keys from monitors ------------------------------------------------- 12.06s 2025-09-27 00:56:10.901145 | orchestrator | ceph-facts : Find a running mon container ------------------------------- 2.19s 2025-09-27 00:56:10.901155 | orchestrator | ceph-facts : Set_fact ceph_admin_command -------------------------------- 1.97s 2025-09-27 00:56:10.901165 | orchestrator | ceph-facts : Get current fsid if cluster is already running ------------- 1.76s 2025-09-27 00:56:10.901174 | orchestrator | ceph-facts : Read osd pool default crush rule --------------------------- 1.72s 2025-09-27 00:56:10.901184 | orchestrator | ceph-facts : Set_fact ceph_run_cmd -------------------------------------- 0.97s 2025-09-27 00:56:10.901193 | orchestrator | ceph-facts : Set_fact _monitor_addresses - ipv4 ------------------------- 0.89s 2025-09-27 00:56:10.901203 | orchestrator | 
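
The "generate keys", "get keys from monitors" and "copy ceph key(s) if needed" tasks above create one client keyring per consuming service (cinder, cinder-backup, nova, glance, gnocchi, manila, ...) on the first monitor and then distribute each keyring to the other monitor nodes. The CLI equivalent for a single key, heavily simplified (the capabilities shown are typical examples, not necessarily the exact caps the playbook sets):

- name: Generate the glance client key on the first monitor (sketch)
  delegate_to: testbed-node-0
  ansible.builtin.command: >
    ceph auth get-or-create client.glance
    mon 'profile rbd' osd 'profile rbd pool=images'
    -o /etc/ceph/ceph.client.glance.keyring
  changed_when: true

- name: Pull the keyring back for distribution to the other monitors (sketch)
  delegate_to: testbed-node-0
  ansible.builtin.fetch:
    src: /etc/ceph/ceph.client.glance.keyring
    dest: fetch/
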
ceph-facts : Check if the ceph mon socket is in-use --------------------- 0.81s 2025-09-27 00:56:10.901213 | orchestrator | ceph-facts : Set_fact devices generate device list when osd_auto_discovery --- 0.79s 2025-09-27 00:56:10.901222 | orchestrator | ceph-facts : Check if podman binary is present -------------------------- 0.73s 2025-09-27 00:56:10.901232 | orchestrator | ceph-facts : Import_tasks set_radosgw_address.yml ----------------------- 0.72s 2025-09-27 00:56:10.901241 | orchestrator | ceph-facts : Check if it is atomic host --------------------------------- 0.69s 2025-09-27 00:56:10.901251 | orchestrator | ceph-facts : Set_fact _radosgw_address to radosgw_address --------------- 0.69s 2025-09-27 00:56:10.901260 | orchestrator | ceph-facts : Check if the ceph conf exists ------------------------------ 0.69s 2025-09-27 00:56:10.901270 | orchestrator | ceph-facts : Set_fact rgw_instances ------------------------------------- 0.68s 2025-09-27 00:56:10.901279 | orchestrator | ceph-facts : Collect existed devices ------------------------------------ 0.66s 2025-09-27 00:56:10.901289 | orchestrator | ceph-facts : Set_fact monitor_name ansible_facts['hostname'] ------------ 0.57s 2025-09-27 00:56:10.901298 | orchestrator | 2025-09-27 00:56:10 | INFO  | Task 8c8495a2-5d72-42cb-9b5a-1c422acc7844 is in state STARTED 2025-09-27 00:56:10.901308 | orchestrator | 2025-09-27 00:56:10 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 00:56:10.901318 | orchestrator | 2025-09-27 00:56:10 | INFO  | Task 04057c8c-9289-4f8f-8b90-07072f1a3ff6 is in state STARTED 2025-09-27 00:56:10.901328 | orchestrator | 2025-09-27 00:56:10 | INFO  | Wait 1 second(s) until the next check 2025-09-27 00:56:13.959690 | orchestrator | 2025-09-27 00:56:13 | INFO  | Task e53b93e4-c64c-4396-8a50-60d422b94ec3 is in state STARTED 2025-09-27 00:56:13.961262 | orchestrator | 2025-09-27 00:56:13 | INFO  | Task 8c8495a2-5d72-42cb-9b5a-1c422acc7844 is in state STARTED 2025-09-27 00:56:13.962613 | orchestrator | 2025-09-27 00:56:13 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 00:56:13.964285 | orchestrator | 2025-09-27 00:56:13 | INFO  | Task 11ecdafd-ad0d-4a2f-8a90-d2bb2a334c9b is in state STARTED 2025-09-27 00:56:13.965976 | orchestrator | 2025-09-27 00:56:13 | INFO  | Task 04057c8c-9289-4f8f-8b90-07072f1a3ff6 is in state STARTED 2025-09-27 00:56:13.966162 | orchestrator | 2025-09-27 00:56:13 | INFO  | Wait 1 second(s) until the next check 2025-09-27 00:56:17.008775 | orchestrator | 2025-09-27 00:56:17 | INFO  | Task e53b93e4-c64c-4396-8a50-60d422b94ec3 is in state STARTED 2025-09-27 00:56:17.010752 | orchestrator | 2025-09-27 00:56:17 | INFO  | Task 8c8495a2-5d72-42cb-9b5a-1c422acc7844 is in state STARTED 2025-09-27 00:56:17.012355 | orchestrator | 2025-09-27 00:56:17 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 00:56:17.014888 | orchestrator | 2025-09-27 00:56:17 | INFO  | Task 11ecdafd-ad0d-4a2f-8a90-d2bb2a334c9b is in state STARTED 2025-09-27 00:56:17.017204 | orchestrator | 2025-09-27 00:56:17 | INFO  | Task 04057c8c-9289-4f8f-8b90-07072f1a3ff6 is in state STARTED 2025-09-27 00:56:17.017877 | orchestrator | 2025-09-27 00:56:17 | INFO  | Wait 1 second(s) until the next check 2025-09-27 00:56:20.059542 | orchestrator | 2025-09-27 00:56:20 | INFO  | Task e53b93e4-c64c-4396-8a50-60d422b94ec3 is in state STARTED 2025-09-27 00:56:20.060820 | orchestrator | 2025-09-27 00:56:20 | INFO  | Task 
8c8495a2-5d72-42cb-9b5a-1c422acc7844 is in state STARTED 2025-09-27 00:56:20.062745 | orchestrator | 2025-09-27 00:56:20 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 00:56:20.064017 | orchestrator | 2025-09-27 00:56:20 | INFO  | Task 11ecdafd-ad0d-4a2f-8a90-d2bb2a334c9b is in state STARTED 2025-09-27 00:56:20.065772 | orchestrator | 2025-09-27 00:56:20 | INFO  | Task 04057c8c-9289-4f8f-8b90-07072f1a3ff6 is in state STARTED 2025-09-27 00:56:20.065805 | orchestrator | 2025-09-27 00:56:20 | INFO  | Wait 1 second(s) until the next check 2025-09-27 00:56:23.110913 | orchestrator | 2025-09-27 00:56:23 | INFO  | Task e53b93e4-c64c-4396-8a50-60d422b94ec3 is in state STARTED 2025-09-27 00:56:23.112788 | orchestrator | 2025-09-27 00:56:23 | INFO  | Task 8c8495a2-5d72-42cb-9b5a-1c422acc7844 is in state STARTED 2025-09-27 00:56:23.113608 | orchestrator | 2025-09-27 00:56:23 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 00:56:23.115155 | orchestrator | 2025-09-27 00:56:23 | INFO  | Task 11ecdafd-ad0d-4a2f-8a90-d2bb2a334c9b is in state STARTED 2025-09-27 00:56:23.116005 | orchestrator | 2025-09-27 00:56:23 | INFO  | Task 04057c8c-9289-4f8f-8b90-07072f1a3ff6 is in state STARTED 2025-09-27 00:56:23.116222 | orchestrator | 2025-09-27 00:56:23 | INFO  | Wait 1 second(s) until the next check 2025-09-27 00:56:26.168456 | orchestrator | 2025-09-27 00:56:26 | INFO  | Task e53b93e4-c64c-4396-8a50-60d422b94ec3 is in state STARTED 2025-09-27 00:56:26.168550 | orchestrator | 2025-09-27 00:56:26 | INFO  | Task 8c8495a2-5d72-42cb-9b5a-1c422acc7844 is in state STARTED 2025-09-27 00:56:26.172011 | orchestrator | 2025-09-27 00:56:26 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 00:56:26.172159 | orchestrator | 2025-09-27 00:56:26 | INFO  | Task 11ecdafd-ad0d-4a2f-8a90-d2bb2a334c9b is in state STARTED 2025-09-27 00:56:26.172180 | orchestrator | 2025-09-27 00:56:26 | INFO  | Task 04057c8c-9289-4f8f-8b90-07072f1a3ff6 is in state STARTED 2025-09-27 00:56:26.172192 | orchestrator | 2025-09-27 00:56:26 | INFO  | Wait 1 second(s) until the next check 2025-09-27 00:56:29.218394 | orchestrator | 2025-09-27 00:56:29 | INFO  | Task e53b93e4-c64c-4396-8a50-60d422b94ec3 is in state STARTED 2025-09-27 00:56:29.220300 | orchestrator | 2025-09-27 00:56:29 | INFO  | Task 8c8495a2-5d72-42cb-9b5a-1c422acc7844 is in state STARTED 2025-09-27 00:56:29.222898 | orchestrator | 2025-09-27 00:56:29 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 00:56:29.224402 | orchestrator | 2025-09-27 00:56:29 | INFO  | Task 11ecdafd-ad0d-4a2f-8a90-d2bb2a334c9b is in state STARTED 2025-09-27 00:56:29.227515 | orchestrator | 2025-09-27 00:56:29 | INFO  | Task 04057c8c-9289-4f8f-8b90-07072f1a3ff6 is in state STARTED 2025-09-27 00:56:29.227544 | orchestrator | 2025-09-27 00:56:29 | INFO  | Wait 1 second(s) until the next check 2025-09-27 00:56:32.277302 | orchestrator | 2025-09-27 00:56:32 | INFO  | Task e53b93e4-c64c-4396-8a50-60d422b94ec3 is in state STARTED 2025-09-27 00:56:32.279667 | orchestrator | 2025-09-27 00:56:32 | INFO  | Task 8c8495a2-5d72-42cb-9b5a-1c422acc7844 is in state STARTED 2025-09-27 00:56:32.282006 | orchestrator | 2025-09-27 00:56:32 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 00:56:32.284124 | orchestrator | 2025-09-27 00:56:32 | INFO  | Task 11ecdafd-ad0d-4a2f-8a90-d2bb2a334c9b is in state STARTED 2025-09-27 00:56:32.285836 | orchestrator | 2025-09-27 
00:56:32 | INFO  | Task 04057c8c-9289-4f8f-8b90-07072f1a3ff6 is in state STARTED 2025-09-27 00:56:32.286098 | orchestrator | 2025-09-27 00:56:32 | INFO  | Wait 1 second(s) until the next check 2025-09-27 00:56:35.333679 | orchestrator | 2025-09-27 00:56:35 | INFO  | Task e53b93e4-c64c-4396-8a50-60d422b94ec3 is in state STARTED 2025-09-27 00:56:35.335628 | orchestrator | 2025-09-27 00:56:35 | INFO  | Task 8c8495a2-5d72-42cb-9b5a-1c422acc7844 is in state STARTED 2025-09-27 00:56:35.336987 | orchestrator | 2025-09-27 00:56:35 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 00:56:35.337869 | orchestrator | 2025-09-27 00:56:35 | INFO  | Task 11ecdafd-ad0d-4a2f-8a90-d2bb2a334c9b is in state STARTED 2025-09-27 00:56:35.339311 | orchestrator | 2025-09-27 00:56:35 | INFO  | Task 04057c8c-9289-4f8f-8b90-07072f1a3ff6 is in state STARTED 2025-09-27 00:56:35.339339 | orchestrator | 2025-09-27 00:56:35 | INFO  | Wait 1 second(s) until the next check 2025-09-27 00:56:38.391433 | orchestrator | 2025-09-27 00:56:38 | INFO  | Task e53b93e4-c64c-4396-8a50-60d422b94ec3 is in state STARTED 2025-09-27 00:56:38.392924 | orchestrator | 2025-09-27 00:56:38 | INFO  | Task 8c8495a2-5d72-42cb-9b5a-1c422acc7844 is in state STARTED 2025-09-27 00:56:38.395088 | orchestrator | 2025-09-27 00:56:38 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 00:56:38.396467 | orchestrator | 2025-09-27 00:56:38 | INFO  | Task 11ecdafd-ad0d-4a2f-8a90-d2bb2a334c9b is in state STARTED 2025-09-27 00:56:38.398388 | orchestrator | 2025-09-27 00:56:38 | INFO  | Task 04057c8c-9289-4f8f-8b90-07072f1a3ff6 is in state STARTED 2025-09-27 00:56:38.398427 | orchestrator | 2025-09-27 00:56:38 | INFO  | Wait 1 second(s) until the next check 2025-09-27 00:56:41.456543 | orchestrator | 2025-09-27 00:56:41 | INFO  | Task e53b93e4-c64c-4396-8a50-60d422b94ec3 is in state STARTED 2025-09-27 00:56:41.457476 | orchestrator | 2025-09-27 00:56:41 | INFO  | Task 8c8495a2-5d72-42cb-9b5a-1c422acc7844 is in state STARTED 2025-09-27 00:56:41.459752 | orchestrator | 2025-09-27 00:56:41 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 00:56:41.460280 | orchestrator | 2025-09-27 00:56:41 | INFO  | Task 11ecdafd-ad0d-4a2f-8a90-d2bb2a334c9b is in state STARTED 2025-09-27 00:56:41.461782 | orchestrator | 2025-09-27 00:56:41 | INFO  | Task 04057c8c-9289-4f8f-8b90-07072f1a3ff6 is in state STARTED 2025-09-27 00:56:41.461808 | orchestrator | 2025-09-27 00:56:41 | INFO  | Wait 1 second(s) until the next check 2025-09-27 00:56:44.511873 | orchestrator | 2025-09-27 00:56:44 | INFO  | Task e53b93e4-c64c-4396-8a50-60d422b94ec3 is in state STARTED 2025-09-27 00:56:44.513765 | orchestrator | 2025-09-27 00:56:44 | INFO  | Task 8c8495a2-5d72-42cb-9b5a-1c422acc7844 is in state STARTED 2025-09-27 00:56:44.518136 | orchestrator | 2025-09-27 00:56:44 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 00:56:44.521982 | orchestrator | 2025-09-27 00:56:44 | INFO  | Task 11ecdafd-ad0d-4a2f-8a90-d2bb2a334c9b is in state STARTED 2025-09-27 00:56:44.524012 | orchestrator | 2025-09-27 00:56:44 | INFO  | Task 04057c8c-9289-4f8f-8b90-07072f1a3ff6 is in state STARTED 2025-09-27 00:56:44.524360 | orchestrator | 2025-09-27 00:56:44 | INFO  | Wait 1 second(s) until the next check 2025-09-27 00:56:47.576803 | orchestrator | 2025-09-27 00:56:47 | INFO  | Task e53b93e4-c64c-4396-8a50-60d422b94ec3 is in state STARTED 2025-09-27 00:56:47.577510 | orchestrator | 2025-09-27 
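
The interleaved INFO lines are the deployment driver polling its background tasks: each task ID is re-checked roughly once per second and the loop only advances once a task leaves STARTED (as happens further below when two of them flip to SUCCESS). The same wait pattern expressed as a retried Ansible task (purely illustrative; "osism task show" is a hypothetical placeholder for whatever command returns the task state):

- name: Wait until a manager task leaves the STARTED state (sketch)
  ansible.builtin.command: osism task show e53b93e4-c64c-4396-8a50-60d422b94ec3   # hypothetical CLI call
  register: task_state
  until: "'STARTED' not in task_state.stdout"
  retries: 600     # give up after roughly ten minutes
  delay: 1         # matches the "Wait 1 second(s)" interval in the log
  changed_when: false
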
00:56:47 | INFO  | Task 8c8495a2-5d72-42cb-9b5a-1c422acc7844 is in state SUCCESS 2025-09-27 00:56:47.580865 | orchestrator | 2025-09-27 00:56:47 | INFO  | Task 6c346132-8a0e-4327-a35f-bf1063d1df57 is in state STARTED 2025-09-27 00:56:47.580890 | orchestrator | 2025-09-27 00:56:47 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 00:56:47.580897 | orchestrator | 2025-09-27 00:56:47 | INFO  | Task 11ecdafd-ad0d-4a2f-8a90-d2bb2a334c9b is in state STARTED 2025-09-27 00:56:47.580903 | orchestrator | 2025-09-27 00:56:47 | INFO  | Task 04057c8c-9289-4f8f-8b90-07072f1a3ff6 is in state SUCCESS 2025-09-27 00:56:47.581540 | orchestrator | 2025-09-27 00:56:47 | INFO  | Wait 1 second(s) until the next check 2025-09-27 00:56:50.632883 | orchestrator | 2025-09-27 00:56:50 | INFO  | Task e53b93e4-c64c-4396-8a50-60d422b94ec3 is in state STARTED 2025-09-27 00:56:50.633755 | orchestrator | 2025-09-27 00:56:50 | INFO  | Task 6c346132-8a0e-4327-a35f-bf1063d1df57 is in state STARTED 2025-09-27 00:56:50.635472 | orchestrator | 2025-09-27 00:56:50 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 00:56:50.637524 | orchestrator | 2025-09-27 00:56:50 | INFO  | Task 4cb1be4c-8822-4c91-a78f-b47ea914c630 is in state STARTED 2025-09-27 00:56:50.638695 | orchestrator | 2025-09-27 00:56:50 | INFO  | Task 11ecdafd-ad0d-4a2f-8a90-d2bb2a334c9b is in state SUCCESS 2025-09-27 00:56:50.639686 | orchestrator | 2025-09-27 00:56:50.639710 | orchestrator | 2025-09-27 00:56:50.639721 | orchestrator | PLAY [Group hosts based on configuration] ************************************** 2025-09-27 00:56:50.639733 | orchestrator | 2025-09-27 00:56:50.639743 | orchestrator | TASK [Group hosts based on Kolla action] *************************************** 2025-09-27 00:56:50.639755 | orchestrator | Saturday 27 September 2025 00:55:49 +0000 (0:00:00.280) 0:00:00.280 **** 2025-09-27 00:56:50.639766 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:56:50.639778 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:56:50.639789 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:56:50.639799 | orchestrator | 2025-09-27 00:56:50.639810 | orchestrator | TASK [Group hosts based on enabled services] *********************************** 2025-09-27 00:56:50.639821 | orchestrator | Saturday 27 September 2025 00:55:49 +0000 (0:00:00.323) 0:00:00.604 **** 2025-09-27 00:56:50.639867 | orchestrator | ok: [testbed-node-0] => (item=enable_placement_True) 2025-09-27 00:56:50.639880 | orchestrator | ok: [testbed-node-1] => (item=enable_placement_True) 2025-09-27 00:56:50.639891 | orchestrator | ok: [testbed-node-2] => (item=enable_placement_True) 2025-09-27 00:56:50.639902 | orchestrator | 2025-09-27 00:56:50.639912 | orchestrator | PLAY [Apply role placement] **************************************************** 2025-09-27 00:56:50.639923 | orchestrator | 2025-09-27 00:56:50.639934 | orchestrator | TASK [placement : include_tasks] *********************************************** 2025-09-27 00:56:50.639945 | orchestrator | Saturday 27 September 2025 00:55:50 +0000 (0:00:00.529) 0:00:01.133 **** 2025-09-27 00:56:50.639956 | orchestrator | included: /ansible/roles/placement/tasks/deploy.yml for testbed-node-0, testbed-node-1, testbed-node-2 2025-09-27 00:56:50.639968 | orchestrator | 2025-09-27 00:56:50.639979 | orchestrator | TASK [service-ks-register : placement | Creating services] ********************* 2025-09-27 00:56:50.639989 | orchestrator | Saturday 27 September 2025 00:55:50 +0000 
(0:00:00.583) 0:00:01.717 **** 2025-09-27 00:56:50.640000 | orchestrator | FAILED - RETRYING: [testbed-node-0]: placement | Creating services (5 retries left). 2025-09-27 00:56:50.640011 | orchestrator | FAILED - RETRYING: [testbed-node-0]: placement | Creating services (4 retries left). 2025-09-27 00:56:50.640048 | orchestrator | FAILED - RETRYING: [testbed-node-0]: placement | Creating services (3 retries left). 2025-09-27 00:56:50.640090 | orchestrator | FAILED - RETRYING: [testbed-node-0]: placement | Creating services (2 retries left). 2025-09-27 00:56:50.640101 | orchestrator | FAILED - RETRYING: [testbed-node-0]: placement | Creating services (1 retries left). 2025-09-27 00:56:50.640115 | orchestrator | failed: [testbed-node-0] (item=placement (placement)) => {"ansible_loop_var": "item", "attempts": 5, "changed": false, "item": {"description": "Placement Service", "endpoints": [{"interface": "internal", "url": "https://api-int.testbed.osism.xyz:8780"}, {"interface": "public", "url": "https://api.testbed.osism.xyz:8780"}], "name": "placement", "type": "placement"}, "msg": "kolla_toolbox container is not running."} 2025-09-27 00:56:50.640129 | orchestrator | 2025-09-27 00:56:50.640140 | orchestrator | PLAY RECAP ********************************************************************* 2025-09-27 00:56:50.640151 | orchestrator | testbed-node-0 : ok=3  changed=0 unreachable=0 failed=1  skipped=0 rescued=0 ignored=0 2025-09-27 00:56:50.640163 | orchestrator | testbed-node-1 : ok=3  changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 2025-09-27 00:56:50.640175 | orchestrator | testbed-node-2 : ok=3  changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 2025-09-27 00:56:50.640186 | orchestrator | 2025-09-27 00:56:50.640196 | orchestrator | 2025-09-27 00:56:50.640207 | orchestrator | TASKS RECAP ******************************************************************** 2025-09-27 00:56:50.640218 | orchestrator | Saturday 27 September 2025 00:56:44 +0000 (0:00:53.266) 0:00:54.983 **** 2025-09-27 00:56:50.640228 | orchestrator | =============================================================================== 2025-09-27 00:56:50.640239 | orchestrator | service-ks-register : placement | Creating services -------------------- 53.27s 2025-09-27 00:56:50.640249 | orchestrator | placement : include_tasks ----------------------------------------------- 0.58s 2025-09-27 00:56:50.640260 | orchestrator | Group hosts based on enabled services ----------------------------------- 0.53s 2025-09-27 00:56:50.640270 | orchestrator | Group hosts based on Kolla action --------------------------------------- 0.32s 2025-09-27 00:56:50.640281 | orchestrator | 2025-09-27 00:56:50.640292 | orchestrator | 2025-09-27 00:56:50.640302 | orchestrator | PLAY [Group hosts based on configuration] ************************************** 2025-09-27 00:56:50.640312 | orchestrator | 2025-09-27 00:56:50.640323 | orchestrator | TASK [Group hosts based on Kolla action] *************************************** 2025-09-27 00:56:50.640334 | orchestrator | Saturday 27 September 2025 00:55:49 +0000 (0:00:00.285) 0:00:00.285 **** 2025-09-27 00:56:50.640344 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:56:50.640355 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:56:50.640366 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:56:50.640376 | orchestrator | 2025-09-27 00:56:50.640387 | orchestrator | TASK [Group hosts based on enabled services] *********************************** 2025-09-27 00:56:50.640397 | orchestrator | 
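
The placement registration above (and the magnum registration that follows) fails with "kolla_toolbox container is not running": kolla-ansible performs Keystone service and endpoint registration through the kolla_toolbox container on the target node, so when that container is down the task exhausts its five retries and gives up. A quick check one could run against the failing node (an illustrative sketch, assuming a Docker-based deployment):

- name: Verify the kolla_toolbox container is up on testbed-node-0 (sketch)
  delegate_to: testbed-node-0
  ansible.builtin.command: docker ps --filter name=kolla_toolbox --filter status=running
  register: toolbox_ps
  changed_when: false
  failed_when: "'kolla_toolbox' not in toolbox_ps.stdout"
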
Saturday 27 September 2025 00:55:50 +0000 (0:00:00.303) 0:00:00.588 **** 2025-09-27 00:56:50.640408 | orchestrator | ok: [testbed-node-0] => (item=enable_magnum_True) 2025-09-27 00:56:50.640419 | orchestrator | ok: [testbed-node-1] => (item=enable_magnum_True) 2025-09-27 00:56:50.640438 | orchestrator | ok: [testbed-node-2] => (item=enable_magnum_True) 2025-09-27 00:56:50.640449 | orchestrator | 2025-09-27 00:56:50.640460 | orchestrator | PLAY [Apply role magnum] ******************************************************* 2025-09-27 00:56:50.640470 | orchestrator | 2025-09-27 00:56:50.640481 | orchestrator | TASK [magnum : include_tasks] ************************************************** 2025-09-27 00:56:50.640503 | orchestrator | Saturday 27 September 2025 00:55:50 +0000 (0:00:00.493) 0:00:01.082 **** 2025-09-27 00:56:50.640514 | orchestrator | included: /ansible/roles/magnum/tasks/deploy.yml for testbed-node-0, testbed-node-1, testbed-node-2 2025-09-27 00:56:50.640525 | orchestrator | 2025-09-27 00:56:50.640536 | orchestrator | TASK [service-ks-register : magnum | Creating services] ************************ 2025-09-27 00:56:50.640556 | orchestrator | Saturday 27 September 2025 00:55:51 +0000 (0:00:00.544) 0:00:01.627 **** 2025-09-27 00:56:50.640566 | orchestrator | FAILED - RETRYING: [testbed-node-0]: magnum | Creating services (5 retries left). 2025-09-27 00:56:50.640577 | orchestrator | FAILED - RETRYING: [testbed-node-0]: magnum | Creating services (4 retries left). 2025-09-27 00:56:50.640587 | orchestrator | FAILED - RETRYING: [testbed-node-0]: magnum | Creating services (3 retries left). 2025-09-27 00:56:50.640598 | orchestrator | FAILED - RETRYING: [testbed-node-0]: magnum | Creating services (2 retries left). 2025-09-27 00:56:50.640608 | orchestrator | FAILED - RETRYING: [testbed-node-0]: magnum | Creating services (1 retries left). 
2025-09-27 00:56:50.640621 | orchestrator | failed: [testbed-node-0] (item=magnum (container-infra)) => {"ansible_loop_var": "item", "attempts": 5, "changed": false, "item": {"description": "Container Infrastructure Management Service", "endpoints": [{"interface": "internal", "url": "https://api-int.testbed.osism.xyz:9511/v1"}, {"interface": "public", "url": "https://api.testbed.osism.xyz:9511/v1"}], "name": "magnum", "type": "container-infra"}, "msg": "kolla_toolbox container is not running."} 2025-09-27 00:56:50.640636 | orchestrator | 2025-09-27 00:56:50.640647 | orchestrator | PLAY RECAP ********************************************************************* 2025-09-27 00:56:50.640657 | orchestrator | testbed-node-0 : ok=3  changed=0 unreachable=0 failed=1  skipped=0 rescued=0 ignored=0 2025-09-27 00:56:50.640668 | orchestrator | testbed-node-1 : ok=3  changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 2025-09-27 00:56:50.640679 | orchestrator | testbed-node-2 : ok=3  changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 2025-09-27 00:56:50.640690 | orchestrator | 2025-09-27 00:56:50.640700 | orchestrator | 2025-09-27 00:56:50.640711 | orchestrator | TASKS RECAP ******************************************************************** 2025-09-27 00:56:50.640722 | orchestrator | Saturday 27 September 2025 00:56:44 +0000 (0:00:53.163) 0:00:54.791 **** 2025-09-27 00:56:50.640732 | orchestrator | =============================================================================== 2025-09-27 00:56:50.640742 | orchestrator | service-ks-register : magnum | Creating services ----------------------- 53.16s 2025-09-27 00:56:50.640753 | orchestrator | magnum : include_tasks -------------------------------------------------- 0.54s 2025-09-27 00:56:50.640764 | orchestrator | Group hosts based on enabled services ----------------------------------- 0.49s 2025-09-27 00:56:50.640774 | orchestrator | Group hosts based on Kolla action --------------------------------------- 0.30s 2025-09-27 00:56:50.640785 | orchestrator | 2025-09-27 00:56:50.640795 | orchestrator | 2025-09-27 00:56:50.640806 | orchestrator | PLAY [Copy ceph keys to the configuration repository] ************************** 2025-09-27 00:56:50.640816 | orchestrator | 2025-09-27 00:56:50.640827 | orchestrator | TASK [Check if ceph keys exist] ************************************************ 2025-09-27 00:56:50.640837 | orchestrator | Saturday 27 September 2025 00:56:13 +0000 (0:00:00.155) 0:00:00.155 **** 2025-09-27 00:56:50.640847 | orchestrator | ok: [testbed-manager -> testbed-node-0(192.168.16.10)] => (item=ceph.client.admin.keyring) 2025-09-27 00:56:50.640858 | orchestrator | ok: [testbed-manager -> testbed-node-0(192.168.16.10)] => (item=ceph.client.cinder.keyring) 2025-09-27 00:56:50.640869 | orchestrator | ok: [testbed-manager -> testbed-node-0(192.168.16.10)] => (item=ceph.client.cinder.keyring) 2025-09-27 00:56:50.640879 | orchestrator | ok: [testbed-manager -> testbed-node-0(192.168.16.10)] => (item=ceph.client.cinder-backup.keyring) 2025-09-27 00:56:50.640890 | orchestrator | ok: [testbed-manager -> testbed-node-0(192.168.16.10)] => (item=ceph.client.cinder.keyring) 2025-09-27 00:56:50.640900 | orchestrator | ok: [testbed-manager -> testbed-node-0(192.168.16.10)] => (item=ceph.client.nova.keyring) 2025-09-27 00:56:50.640911 | orchestrator | ok: [testbed-manager -> testbed-node-0(192.168.16.10)] => (item=ceph.client.glance.keyring) 2025-09-27 00:56:50.640929 | orchestrator | ok: [testbed-manager -> 
testbed-node-0(192.168.16.10)] => (item=ceph.client.gnocchi.keyring) 2025-09-27 00:56:50.640940 | orchestrator | ok: [testbed-manager -> testbed-node-0(192.168.16.10)] => (item=ceph.client.manila.keyring) 2025-09-27 00:56:50.640950 | orchestrator | 2025-09-27 00:56:50.640961 | orchestrator | TASK [Fetch all ceph keys] ***************************************************** 2025-09-27 00:56:50.640972 | orchestrator | Saturday 27 September 2025 00:56:18 +0000 (0:00:04.735) 0:00:04.890 **** 2025-09-27 00:56:50.640982 | orchestrator | ok: [testbed-manager -> testbed-node-0(192.168.16.10)] => (item=ceph.client.admin.keyring) 2025-09-27 00:56:50.640998 | orchestrator | ok: [testbed-manager -> testbed-node-0(192.168.16.10)] => (item=ceph.client.cinder.keyring) 2025-09-27 00:56:50.641009 | orchestrator | ok: [testbed-manager -> testbed-node-0(192.168.16.10)] => (item=ceph.client.cinder.keyring) 2025-09-27 00:56:50.641020 | orchestrator | ok: [testbed-manager -> testbed-node-0(192.168.16.10)] => (item=ceph.client.cinder-backup.keyring) 2025-09-27 00:56:50.641036 | orchestrator | ok: [testbed-manager -> testbed-node-0(192.168.16.10)] => (item=ceph.client.cinder.keyring) 2025-09-27 00:56:50.641047 | orchestrator | ok: [testbed-manager -> testbed-node-0(192.168.16.10)] => (item=ceph.client.nova.keyring) 2025-09-27 00:56:50.641074 | orchestrator | ok: [testbed-manager -> testbed-node-0(192.168.16.10)] => (item=ceph.client.glance.keyring) 2025-09-27 00:56:50.641085 | orchestrator | ok: [testbed-manager -> testbed-node-0(192.168.16.10)] => (item=ceph.client.gnocchi.keyring) 2025-09-27 00:56:50.641096 | orchestrator | ok: [testbed-manager -> testbed-node-0(192.168.16.10)] => (item=ceph.client.manila.keyring) 2025-09-27 00:56:50.641106 | orchestrator | 2025-09-27 00:56:50.641117 | orchestrator | TASK [Create share directory] ************************************************** 2025-09-27 00:56:50.641128 | orchestrator | Saturday 27 September 2025 00:56:22 +0000 (0:00:04.337) 0:00:09.228 **** 2025-09-27 00:56:50.641139 | orchestrator | changed: [testbed-manager -> localhost] 2025-09-27 00:56:50.641149 | orchestrator | 2025-09-27 00:56:50.641160 | orchestrator | TASK [Write ceph keys to the share directory] ********************************** 2025-09-27 00:56:50.641170 | orchestrator | Saturday 27 September 2025 00:56:23 +0000 (0:00:00.976) 0:00:10.204 **** 2025-09-27 00:56:50.641181 | orchestrator | changed: [testbed-manager -> localhost] => (item=ceph.client.admin.keyring) 2025-09-27 00:56:50.641192 | orchestrator | changed: [testbed-manager -> localhost] => (item=ceph.client.cinder.keyring) 2025-09-27 00:56:50.641202 | orchestrator | ok: [testbed-manager -> localhost] => (item=ceph.client.cinder.keyring) 2025-09-27 00:56:50.641213 | orchestrator | changed: [testbed-manager -> localhost] => (item=ceph.client.cinder-backup.keyring) 2025-09-27 00:56:50.641224 | orchestrator | ok: [testbed-manager -> localhost] => (item=ceph.client.cinder.keyring) 2025-09-27 00:56:50.641234 | orchestrator | changed: [testbed-manager -> localhost] => (item=ceph.client.nova.keyring) 2025-09-27 00:56:50.641245 | orchestrator | changed: [testbed-manager -> localhost] => (item=ceph.client.glance.keyring) 2025-09-27 00:56:50.641256 | orchestrator | changed: [testbed-manager -> localhost] => (item=ceph.client.gnocchi.keyring) 2025-09-27 00:56:50.641266 | orchestrator | changed: [testbed-manager -> localhost] => (item=ceph.client.manila.keyring) 2025-09-27 00:56:50.641277 | orchestrator | 2025-09-27 00:56:50.641288 | orchestrator | TASK 
[Check if target directories exist] *************************************** 2025-09-27 00:56:50.641298 | orchestrator | Saturday 27 September 2025 00:56:37 +0000 (0:00:13.469) 0:00:23.674 **** 2025-09-27 00:56:50.641309 | orchestrator | ok: [testbed-manager] => (item=/opt/configuration/environments/infrastructure/files/ceph) 2025-09-27 00:56:50.641320 | orchestrator | ok: [testbed-manager] => (item=/opt/configuration/environments/kolla/files/overlays/cinder/cinder-volume) 2025-09-27 00:56:50.641330 | orchestrator | ok: [testbed-manager] => (item=/opt/configuration/environments/kolla/files/overlays/cinder/cinder-backup) 2025-09-27 00:56:50.641341 | orchestrator | ok: [testbed-manager] => (item=/opt/configuration/environments/kolla/files/overlays/cinder/cinder-backup) 2025-09-27 00:56:50.641359 | orchestrator | ok: [testbed-manager] => (item=/opt/configuration/environments/kolla/files/overlays/nova) 2025-09-27 00:56:50.641370 | orchestrator | ok: [testbed-manager] => (item=/opt/configuration/environments/kolla/files/overlays/nova) 2025-09-27 00:56:50.641381 | orchestrator | ok: [testbed-manager] => (item=/opt/configuration/environments/kolla/files/overlays/glance) 2025-09-27 00:56:50.641391 | orchestrator | ok: [testbed-manager] => (item=/opt/configuration/environments/kolla/files/overlays/gnocchi) 2025-09-27 00:56:50.641402 | orchestrator | ok: [testbed-manager] => (item=/opt/configuration/environments/kolla/files/overlays/manila) 2025-09-27 00:56:50.641413 | orchestrator | 2025-09-27 00:56:50.641423 | orchestrator | TASK [Write ceph keys to the configuration directory] ************************** 2025-09-27 00:56:50.641434 | orchestrator | Saturday 27 September 2025 00:56:40 +0000 (0:00:03.093) 0:00:26.767 **** 2025-09-27 00:56:50.641445 | orchestrator | changed: [testbed-manager] => (item=ceph.client.admin.keyring) 2025-09-27 00:56:50.641456 | orchestrator | changed: [testbed-manager] => (item=ceph.client.cinder.keyring) 2025-09-27 00:56:50.641466 | orchestrator | changed: [testbed-manager] => (item=ceph.client.cinder.keyring) 2025-09-27 00:56:50.641477 | orchestrator | changed: [testbed-manager] => (item=ceph.client.cinder-backup.keyring) 2025-09-27 00:56:50.641488 | orchestrator | changed: [testbed-manager] => (item=ceph.client.cinder.keyring) 2025-09-27 00:56:50.641498 | orchestrator | changed: [testbed-manager] => (item=ceph.client.nova.keyring) 2025-09-27 00:56:50.641509 | orchestrator | changed: [testbed-manager] => (item=ceph.client.glance.keyring) 2025-09-27 00:56:50.641520 | orchestrator | changed: [testbed-manager] => (item=ceph.client.gnocchi.keyring) 2025-09-27 00:56:50.641530 | orchestrator | changed: [testbed-manager] => (item=ceph.client.manila.keyring) 2025-09-27 00:56:50.641541 | orchestrator | 2025-09-27 00:56:50.641552 | orchestrator | PLAY RECAP ********************************************************************* 2025-09-27 00:56:50.641568 | orchestrator | testbed-manager : ok=6  changed=3  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 2025-09-27 00:56:50.641579 | orchestrator | 2025-09-27 00:56:50.641589 | orchestrator | 2025-09-27 00:56:50.641600 | orchestrator | TASKS RECAP ******************************************************************** 2025-09-27 00:56:50.641611 | orchestrator | Saturday 27 September 2025 00:56:47 +0000 (0:00:06.790) 0:00:33.558 **** 2025-09-27 00:56:50.641627 | orchestrator | =============================================================================== 2025-09-27 00:56:50.641638 | orchestrator | Write ceph keys to the share directory 
--------------------------------- 13.47s 2025-09-27 00:56:50.641649 | orchestrator | Write ceph keys to the configuration directory -------------------------- 6.79s 2025-09-27 00:56:50.641660 | orchestrator | Check if ceph keys exist ------------------------------------------------ 4.74s 2025-09-27 00:56:50.641670 | orchestrator | Fetch all ceph keys ----------------------------------------------------- 4.34s 2025-09-27 00:56:50.641681 | orchestrator | Check if target directories exist --------------------------------------- 3.09s 2025-09-27 00:56:50.641692 | orchestrator | Create share directory -------------------------------------------------- 0.98s 2025-09-27 00:56:50.641772 | orchestrator | 2025-09-27 00:56:50 | INFO  | Wait 1 second(s) until the next check 2025-09-27 00:56:53.698603 | orchestrator | 2025-09-27 00:56:53 | INFO  | Task e53b93e4-c64c-4396-8a50-60d422b94ec3 is in state STARTED 2025-09-27 00:56:53.700446 | orchestrator | 2025-09-27 00:56:53 | INFO  | Task 6c346132-8a0e-4327-a35f-bf1063d1df57 is in state STARTED 2025-09-27 00:56:53.702194 | orchestrator | 2025-09-27 00:56:53 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 00:56:53.704091 | orchestrator | 2025-09-27 00:56:53 | INFO  | Task 4cb1be4c-8822-4c91-a78f-b47ea914c630 is in state STARTED 2025-09-27 00:56:53.704302 | orchestrator | 2025-09-27 00:56:53 | INFO  | Wait 1 second(s) until the next check 2025-09-27 00:56:56.751548 | orchestrator | 2025-09-27 00:56:56 | INFO  | Task e53b93e4-c64c-4396-8a50-60d422b94ec3 is in state STARTED 2025-09-27 00:56:56.752857 | orchestrator | 2025-09-27 00:56:56 | INFO  | Task 6c346132-8a0e-4327-a35f-bf1063d1df57 is in state STARTED 2025-09-27 00:56:56.756345 | orchestrator | 2025-09-27 00:56:56 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 00:56:56.757717 | orchestrator | 2025-09-27 00:56:56 | INFO  | Task 4cb1be4c-8822-4c91-a78f-b47ea914c630 is in state STARTED 2025-09-27 00:56:56.758189 | orchestrator | 2025-09-27 00:56:56 | INFO  | Wait 1 second(s) until the next check 2025-09-27 00:56:59.808203 | orchestrator | 2025-09-27 00:56:59 | INFO  | Task e53b93e4-c64c-4396-8a50-60d422b94ec3 is in state STARTED 2025-09-27 00:56:59.810943 | orchestrator | 2025-09-27 00:56:59 | INFO  | Task 6c346132-8a0e-4327-a35f-bf1063d1df57 is in state STARTED 2025-09-27 00:56:59.813733 | orchestrator | 2025-09-27 00:56:59 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 00:56:59.815342 | orchestrator | 2025-09-27 00:56:59 | INFO  | Task 4cb1be4c-8822-4c91-a78f-b47ea914c630 is in state STARTED 2025-09-27 00:56:59.815873 | orchestrator | 2025-09-27 00:56:59 | INFO  | Wait 1 second(s) until the next check 2025-09-27 00:57:02.865851 | orchestrator | 2025-09-27 00:57:02 | INFO  | Task e53b93e4-c64c-4396-8a50-60d422b94ec3 is in state STARTED 2025-09-27 00:57:02.867606 | orchestrator | 2025-09-27 00:57:02 | INFO  | Task 6c346132-8a0e-4327-a35f-bf1063d1df57 is in state STARTED 2025-09-27 00:57:02.870207 | orchestrator | 2025-09-27 00:57:02 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 00:57:02.873834 | orchestrator | 2025-09-27 00:57:02 | INFO  | Task 4cb1be4c-8822-4c91-a78f-b47ea914c630 is in state STARTED 2025-09-27 00:57:02.873863 | orchestrator | 2025-09-27 00:57:02 | INFO  | Wait 1 second(s) until the next check 2025-09-27 00:57:05.915508 | orchestrator | 2025-09-27 00:57:05 | INFO  | Task e53b93e4-c64c-4396-8a50-60d422b94ec3 is in state STARTED 2025-09-27 
00:57:05.915729 | orchestrator | 2025-09-27 00:57:05 | INFO  | Task 6c346132-8a0e-4327-a35f-bf1063d1df57 is in state STARTED 2025-09-27 00:57:05.917583 | orchestrator | 2025-09-27 00:57:05 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 00:57:05.920256 | orchestrator | 2025-09-27 00:57:05 | INFO  | Task 4cb1be4c-8822-4c91-a78f-b47ea914c630 is in state STARTED 2025-09-27 00:57:05.920292 | orchestrator | 2025-09-27 00:57:05 | INFO  | Wait 1 second(s) until the next check 2025-09-27 00:57:39.538555 |
orchestrator | 2025-09-27 00:57:39 | INFO  | Task e53b93e4-c64c-4396-8a50-60d422b94ec3 is in state STARTED 2025-09-27 00:57:39.541794 | orchestrator | 2025-09-27 00:57:39 | INFO  | Task 6c346132-8a0e-4327-a35f-bf1063d1df57 is in state STARTED 2025-09-27 00:57:39.544575 | orchestrator | 2025-09-27 00:57:39 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 00:57:39.547788 | orchestrator | 2025-09-27 00:57:39 | INFO  | Task 4cb1be4c-8822-4c91-a78f-b47ea914c630 is in state STARTED 2025-09-27 00:57:39.547817 | orchestrator | 2025-09-27 00:57:39 | INFO  | Wait 1 second(s) until the next check 2025-09-27 00:57:42.599557 | orchestrator | 2025-09-27 00:57:42 | INFO  | Task e53b93e4-c64c-4396-8a50-60d422b94ec3 is in state STARTED 2025-09-27 00:57:42.604271 | orchestrator | 2025-09-27 00:57:42 | INFO  | Task 6c346132-8a0e-4327-a35f-bf1063d1df57 is in state STARTED 2025-09-27 00:57:42.606159 | orchestrator | 2025-09-27 00:57:42 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 00:57:42.607859 | orchestrator | 2025-09-27 00:57:42 | INFO  | Task 4cb1be4c-8822-4c91-a78f-b47ea914c630 is in state STARTED 2025-09-27 00:57:42.607964 | orchestrator | 2025-09-27 00:57:42 | INFO  | Wait 1 second(s) until the next check 2025-09-27 00:57:45.647934 | orchestrator | 2025-09-27 00:57:45 | INFO  | Task e53b93e4-c64c-4396-8a50-60d422b94ec3 is in state STARTED 2025-09-27 00:57:45.648791 | orchestrator | 2025-09-27 00:57:45 | INFO  | Task 6c346132-8a0e-4327-a35f-bf1063d1df57 is in state SUCCESS 2025-09-27 00:57:45.655261 | orchestrator | 2025-09-27 00:57:45 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 00:57:45.657864 | orchestrator | 2025-09-27 00:57:45 | INFO  | Task 4cb1be4c-8822-4c91-a78f-b47ea914c630 is in state STARTED 2025-09-27 00:57:45.657888 | orchestrator | 2025-09-27 00:57:45 | INFO  | Wait 1 second(s) until the next check 2025-09-27 00:57:48.703570 | orchestrator | 2025-09-27 00:57:48 | INFO  | Task e53b93e4-c64c-4396-8a50-60d422b94ec3 is in state STARTED 2025-09-27 00:57:48.705254 | orchestrator | 2025-09-27 00:57:48 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 00:57:48.707260 | orchestrator | 2025-09-27 00:57:48 | INFO  | Task 4cb1be4c-8822-4c91-a78f-b47ea914c630 is in state STARTED 2025-09-27 00:57:48.707339 | orchestrator | 2025-09-27 00:57:48 | INFO  | Wait 1 second(s) until the next check 2025-09-27 00:57:51.777890 | orchestrator | 2025-09-27 00:57:51 | INFO  | Task e53b93e4-c64c-4396-8a50-60d422b94ec3 is in state STARTED 2025-09-27 00:57:51.777987 | orchestrator | 2025-09-27 00:57:51 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 00:57:51.778657 | orchestrator | 2025-09-27 00:57:51 | INFO  | Task 837ad8ab-f13b-4706-84f7-9eb83f6ce45a is in state STARTED 2025-09-27 00:57:51.779259 | orchestrator | 2025-09-27 00:57:51 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 00:57:51.780563 | orchestrator | 2025-09-27 00:57:51 | INFO  | Task 4cb1be4c-8822-4c91-a78f-b47ea914c630 is in state SUCCESS 2025-09-27 00:57:51.780908 | orchestrator | 2025-09-27 00:57:51.780931 | orchestrator | 2025-09-27 00:57:51.780960 | orchestrator | PLAY [Group hosts based on configuration] ************************************** 2025-09-27 00:57:51.780973 | orchestrator | 2025-09-27 00:57:51.780984 | orchestrator | TASK [Group hosts based on Kolla action] *************************************** 2025-09-27 00:57:51.780996 | 
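Before the octavia play continues below, a note on the "Copy ceph keys to the configuration repository" play that completed above: it checks the client keyrings on testbed-node-0, fetches them to the manager, stages them in a share directory, and then writes them into the kolla overlay directories under /opt/configuration so that cinder, nova, glance, gnocchi and manila pick up the right credentials on the next deploy. A simplified sketch of the fetch-and-distribute idea (the overlay paths mirror the log output, while the source path under /etc/ceph and the staging directory are assumptions, and the real play in osism/testbed covers more keyrings and targets):

```yaml
# Simplified sketch of the ceph key distribution flow; not the actual play
# from the osism/testbed repository.
- name: Fetch all ceph keys
  ansible.builtin.fetch:
    src: "/etc/ceph/{{ item }}"             # assumed keyring location on the ceph node
    dest: "/share/ceph/{{ item }}"          # assumed staging path on the controller
    flat: true
  delegate_to: testbed-node-0
  loop:
    - ceph.client.cinder.keyring
    - ceph.client.nova.keyring
    - ceph.client.glance.keyring

- name: Write ceph keys to the configuration directory
  ansible.builtin.copy:
    src: "/share/ceph/{{ item.keyring }}"
    dest: "{{ item.dir }}/{{ item.keyring }}"
    mode: "0640"
  loop:
    - keyring: ceph.client.cinder.keyring
      dir: /opt/configuration/environments/kolla/files/overlays/cinder/cinder-volume
    - keyring: ceph.client.nova.keyring
      dir: /opt/configuration/environments/kolla/files/overlays/nova
    - keyring: ceph.client.glance.keyring
      dir: /opt/configuration/environments/kolla/files/overlays/glance
```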
orchestrator | Saturday 27 September 2025 00:56:48 +0000 (0:00:00.260) 0:00:00.260 **** 2025-09-27 00:57:51.781007 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:57:51.781018 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:57:51.781029 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:57:51.781062 | orchestrator | 2025-09-27 00:57:51.781074 | orchestrator | TASK [Group hosts based on enabled services] *********************************** 2025-09-27 00:57:51.781085 | orchestrator | Saturday 27 September 2025 00:56:48 +0000 (0:00:00.272) 0:00:00.533 **** 2025-09-27 00:57:51.781096 | orchestrator | ok: [testbed-node-0] => (item=enable_octavia_True) 2025-09-27 00:57:51.781107 | orchestrator | ok: [testbed-node-1] => (item=enable_octavia_True) 2025-09-27 00:57:51.781118 | orchestrator | ok: [testbed-node-2] => (item=enable_octavia_True) 2025-09-27 00:57:51.781129 | orchestrator | 2025-09-27 00:57:51.781140 | orchestrator | PLAY [Apply role octavia] ****************************************************** 2025-09-27 00:57:51.781151 | orchestrator | 2025-09-27 00:57:51.781162 | orchestrator | TASK [octavia : include_tasks] ************************************************* 2025-09-27 00:57:51.781173 | orchestrator | Saturday 27 September 2025 00:56:49 +0000 (0:00:00.415) 0:00:00.948 **** 2025-09-27 00:57:51.781184 | orchestrator | included: /ansible/roles/octavia/tasks/deploy.yml for testbed-node-0, testbed-node-1, testbed-node-2 2025-09-27 00:57:51.781196 | orchestrator | 2025-09-27 00:57:51.781207 | orchestrator | TASK [service-ks-register : octavia | Creating services] *********************** 2025-09-27 00:57:51.781217 | orchestrator | Saturday 27 September 2025 00:56:49 +0000 (0:00:00.527) 0:00:01.476 **** 2025-09-27 00:57:51.781229 | orchestrator | FAILED - RETRYING: [testbed-node-0]: octavia | Creating services (5 retries left). 2025-09-27 00:57:51.781239 | orchestrator | FAILED - RETRYING: [testbed-node-0]: octavia | Creating services (4 retries left). 2025-09-27 00:57:51.781250 | orchestrator | FAILED - RETRYING: [testbed-node-0]: octavia | Creating services (3 retries left). 2025-09-27 00:57:51.781261 | orchestrator | FAILED - RETRYING: [testbed-node-0]: octavia | Creating services (2 retries left). 2025-09-27 00:57:51.781272 | orchestrator | FAILED - RETRYING: [testbed-node-0]: octavia | Creating services (1 retries left). 
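The octavia registration below fails for the same reason as placement and magnum above: the kolla_toolbox container on testbed-node-0 is not running, so every Keystone registration delegated into it exhausts its five retries. A small pre-flight check along the following lines can turn those slow retries into an immediate, more descriptive failure; this is a sketch that assumes the community.docker collection is available, it is not part of the deployed playbooks:

```yaml
# Sketch of an optional pre-flight check; not part of kolla-ansible or OSISM.
- name: Inspect the kolla_toolbox container
  community.docker.docker_container_info:
    name: kolla_toolbox
  register: toolbox_info

- name: Fail early when kolla_toolbox is not running
  ansible.builtin.assert:
    that:
      - toolbox_info.exists
      - toolbox_info.container.State.Running
    fail_msg: >-
      kolla_toolbox is not running on {{ inventory_hostname }};
      service registration in Keystone cannot work until it is back up.
```

The actual remedy is to bring the kolla_toolbox container back up (it is normally deployed by kolla-ansible's common role) and re-run the failed service plays.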
2025-09-27 00:57:51.781285 | orchestrator | failed: [testbed-node-0] (item=octavia (load-balancer)) => {"ansible_loop_var": "item", "attempts": 5, "changed": false, "item": {"description": "Octavia Load Balancing Service", "endpoints": [{"interface": "internal", "url": "https://api-int.testbed.osism.xyz:9876"}, {"interface": "public", "url": "https://api.testbed.osism.xyz:9876"}], "name": "octavia", "type": "load-balancer"}, "msg": "kolla_toolbox container is not running."} 2025-09-27 00:57:51.781324 | orchestrator | 2025-09-27 00:57:51.781336 | orchestrator | PLAY RECAP ********************************************************************* 2025-09-27 00:57:51.781346 | orchestrator | testbed-node-0 : ok=3  changed=0 unreachable=0 failed=1  skipped=0 rescued=0 ignored=0 2025-09-27 00:57:51.781359 | orchestrator | testbed-node-1 : ok=3  changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 2025-09-27 00:57:51.781371 | orchestrator | testbed-node-2 : ok=3  changed=0 unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 2025-09-27 00:57:51.781382 | orchestrator | 2025-09-27 00:57:51.781393 | orchestrator | 2025-09-27 00:57:51.781404 | orchestrator | TASKS RECAP ******************************************************************** 2025-09-27 00:57:51.781415 | orchestrator | Saturday 27 September 2025 00:57:42 +0000 (0:00:53.146) 0:00:54.622 **** 2025-09-27 00:57:51.781426 | orchestrator | =============================================================================== 2025-09-27 00:57:51.781436 | orchestrator | service-ks-register : octavia | Creating services ---------------------- 53.15s 2025-09-27 00:57:51.781447 | orchestrator | octavia : include_tasks ------------------------------------------------- 0.53s 2025-09-27 00:57:51.781458 | orchestrator | Group hosts based on enabled services ----------------------------------- 0.42s 2025-09-27 00:57:51.781468 | orchestrator | Group hosts based on Kolla action --------------------------------------- 0.27s 2025-09-27 00:57:51.781479 | orchestrator | 2025-09-27 00:57:51.781490 | orchestrator | 2025-09-27 00:57:51.781500 | orchestrator | PLAY [Apply role cephclient] *************************************************** 2025-09-27 00:57:51.781511 | orchestrator | 2025-09-27 00:57:51.781522 | orchestrator | TASK [osism.services.cephclient : Include container tasks] ********************* 2025-09-27 00:57:51.781534 | orchestrator | Saturday 27 September 2025 00:56:51 +0000 (0:00:00.235) 0:00:00.235 **** 2025-09-27 00:57:51.781547 | orchestrator | included: /usr/share/ansible/collections/ansible_collections/osism/services/roles/cephclient/tasks/container.yml for testbed-manager 2025-09-27 00:57:51.781559 | orchestrator | 2025-09-27 00:57:51.781571 | orchestrator | TASK [osism.services.cephclient : Create required directories] ***************** 2025-09-27 00:57:51.781584 | orchestrator | Saturday 27 September 2025 00:56:52 +0000 (0:00:00.227) 0:00:00.462 **** 2025-09-27 00:57:51.781597 | orchestrator | changed: [testbed-manager] => (item=/opt/cephclient/configuration) 2025-09-27 00:57:51.781609 | orchestrator | changed: [testbed-manager] => (item=/opt/cephclient/data) 2025-09-27 00:57:51.781621 | orchestrator | ok: [testbed-manager] => (item=/opt/cephclient) 2025-09-27 00:57:51.781633 | orchestrator | 2025-09-27 00:57:51.781646 | orchestrator | TASK [osism.services.cephclient : Copy configuration files] ******************** 2025-09-27 00:57:51.781675 | orchestrator | Saturday 27 September 2025 00:56:53 +0000 (0:00:01.235) 0:00:01.698 **** 2025-09-27 
00:57:51.781689 | orchestrator | changed: [testbed-manager] => (item={'src': 'ceph.conf.j2', 'dest': '/opt/cephclient/configuration/ceph.conf'}) 2025-09-27 00:57:51.781702 | orchestrator | 2025-09-27 00:57:51.781714 | orchestrator | TASK [osism.services.cephclient : Copy keyring file] *************************** 2025-09-27 00:57:51.781726 | orchestrator | Saturday 27 September 2025 00:56:54 +0000 (0:00:01.125) 0:00:02.823 **** 2025-09-27 00:57:51.781738 | orchestrator | changed: [testbed-manager] 2025-09-27 00:57:51.781750 | orchestrator | 2025-09-27 00:57:51.781762 | orchestrator | TASK [osism.services.cephclient : Copy docker-compose.yml file] **************** 2025-09-27 00:57:51.781775 | orchestrator | Saturday 27 September 2025 00:56:55 +0000 (0:00:00.978) 0:00:03.802 **** 2025-09-27 00:57:51.781787 | orchestrator | changed: [testbed-manager] 2025-09-27 00:57:51.781799 | orchestrator | 2025-09-27 00:57:51.781809 | orchestrator | TASK [osism.services.cephclient : Manage cephclient service] ******************* 2025-09-27 00:57:51.781820 | orchestrator | Saturday 27 September 2025 00:56:56 +0000 (0:00:00.995) 0:00:04.797 **** 2025-09-27 00:57:51.781847 | orchestrator | FAILED - RETRYING: [testbed-manager]: Manage cephclient service (10 retries left). 2025-09-27 00:57:51.781858 | orchestrator | ok: [testbed-manager] 2025-09-27 00:57:51.781868 | orchestrator | 2025-09-27 00:57:51.781879 | orchestrator | TASK [osism.services.cephclient : Copy wrapper scripts] ************************ 2025-09-27 00:57:51.781890 | orchestrator | Saturday 27 September 2025 00:57:38 +0000 (0:00:42.423) 0:00:47.220 **** 2025-09-27 00:57:51.781900 | orchestrator | changed: [testbed-manager] => (item=ceph) 2025-09-27 00:57:51.781911 | orchestrator | changed: [testbed-manager] => (item=ceph-authtool) 2025-09-27 00:57:51.781922 | orchestrator | changed: [testbed-manager] => (item=rados) 2025-09-27 00:57:51.781933 | orchestrator | changed: [testbed-manager] => (item=radosgw-admin) 2025-09-27 00:57:51.781943 | orchestrator | changed: [testbed-manager] => (item=rbd) 2025-09-27 00:57:51.781954 | orchestrator | 2025-09-27 00:57:51.781965 | orchestrator | TASK [osism.services.cephclient : Remove old wrapper scripts] ****************** 2025-09-27 00:57:51.781975 | orchestrator | Saturday 27 September 2025 00:57:43 +0000 (0:00:04.282) 0:00:51.503 **** 2025-09-27 00:57:51.781986 | orchestrator | ok: [testbed-manager] => (item=crushtool) 2025-09-27 00:57:51.781997 | orchestrator | 2025-09-27 00:57:51.782007 | orchestrator | TASK [osism.services.cephclient : Include package tasks] *********************** 2025-09-27 00:57:51.782073 | orchestrator | Saturday 27 September 2025 00:57:43 +0000 (0:00:00.500) 0:00:52.003 **** 2025-09-27 00:57:51.782087 | orchestrator | skipping: [testbed-manager] 2025-09-27 00:57:51.782097 | orchestrator | 2025-09-27 00:57:51.782108 | orchestrator | TASK [osism.services.cephclient : Include rook task] *************************** 2025-09-27 00:57:51.782119 | orchestrator | Saturday 27 September 2025 00:57:43 +0000 (0:00:00.133) 0:00:52.137 **** 2025-09-27 00:57:51.782129 | orchestrator | skipping: [testbed-manager] 2025-09-27 00:57:51.782140 | orchestrator | 2025-09-27 00:57:51.782150 | orchestrator | RUNNING HANDLER [osism.services.cephclient : Restart cephclient service] ******* 2025-09-27 00:57:51.782161 | orchestrator | Saturday 27 September 2025 00:57:44 +0000 (0:00:00.322) 0:00:52.460 **** 2025-09-27 00:57:51.782171 | orchestrator | changed: [testbed-manager] 2025-09-27 00:57:51.782182 | 
orchestrator | 2025-09-27 00:57:51.782193 | orchestrator | RUNNING HANDLER [osism.services.cephclient : Ensure that all containers are up] *** 2025-09-27 00:57:51.782203 | orchestrator | Saturday 27 September 2025 00:57:46 +0000 (0:00:02.356) 0:00:54.816 **** 2025-09-27 00:57:51.782214 | orchestrator | changed: [testbed-manager] 2025-09-27 00:57:51.782225 | orchestrator | 2025-09-27 00:57:51.782238 | orchestrator | RUNNING HANDLER [osism.services.cephclient : Wait for an healthy service] ****** 2025-09-27 00:57:51.782249 | orchestrator | Saturday 27 September 2025 00:57:47 +0000 (0:00:00.872) 0:00:55.689 **** 2025-09-27 00:57:51.782260 | orchestrator | changed: [testbed-manager] 2025-09-27 00:57:51.782270 | orchestrator | 2025-09-27 00:57:51.782281 | orchestrator | RUNNING HANDLER [osism.services.cephclient : Copy bash completion scripts] ***** 2025-09-27 00:57:51.782291 | orchestrator | Saturday 27 September 2025 00:57:48 +0000 (0:00:00.681) 0:00:56.370 **** 2025-09-27 00:57:51.782302 | orchestrator | ok: [testbed-manager] => (item=ceph) 2025-09-27 00:57:51.782313 | orchestrator | ok: [testbed-manager] => (item=rados) 2025-09-27 00:57:51.782324 | orchestrator | ok: [testbed-manager] => (item=radosgw-admin) 2025-09-27 00:57:51.782334 | orchestrator | ok: [testbed-manager] => (item=rbd) 2025-09-27 00:57:51.782345 | orchestrator | 2025-09-27 00:57:51.782355 | orchestrator | PLAY RECAP ********************************************************************* 2025-09-27 00:57:51.782366 | orchestrator | testbed-manager : ok=12  changed=8  unreachable=0 failed=0 skipped=2  rescued=0 ignored=0 2025-09-27 00:57:51.782377 | orchestrator | 2025-09-27 00:57:51.782388 | orchestrator | 2025-09-27 00:57:51.782398 | orchestrator | TASKS RECAP ******************************************************************** 2025-09-27 00:57:51.782409 | orchestrator | Saturday 27 September 2025 00:57:49 +0000 (0:00:01.568) 0:00:57.939 **** 2025-09-27 00:57:51.782419 | orchestrator | =============================================================================== 2025-09-27 00:57:51.782438 | orchestrator | osism.services.cephclient : Manage cephclient service ------------------ 42.42s 2025-09-27 00:57:51.782448 | orchestrator | osism.services.cephclient : Copy wrapper scripts ------------------------ 4.28s 2025-09-27 00:57:51.782459 | orchestrator | osism.services.cephclient : Restart cephclient service ------------------ 2.36s 2025-09-27 00:57:51.782469 | orchestrator | osism.services.cephclient : Copy bash completion scripts ---------------- 1.57s 2025-09-27 00:57:51.782480 | orchestrator | osism.services.cephclient : Create required directories ----------------- 1.24s 2025-09-27 00:57:51.782491 | orchestrator | osism.services.cephclient : Copy configuration files -------------------- 1.13s 2025-09-27 00:57:51.782501 | orchestrator | osism.services.cephclient : Copy docker-compose.yml file ---------------- 1.00s 2025-09-27 00:57:51.782512 | orchestrator | osism.services.cephclient : Copy keyring file --------------------------- 0.98s 2025-09-27 00:57:51.782522 | orchestrator | osism.services.cephclient : Ensure that all containers are up ----------- 0.87s 2025-09-27 00:57:51.782546 | orchestrator | osism.services.cephclient : Wait for an healthy service ----------------- 0.68s 2025-09-27 00:57:51.782558 | orchestrator | osism.services.cephclient : Remove old wrapper scripts ------------------ 0.50s 2025-09-27 00:57:51.782568 | orchestrator | osism.services.cephclient : Include rook task --------------------------- 0.32s 
2025-09-27 00:57:51.782579 | orchestrator | osism.services.cephclient : Include container tasks --------------------- 0.23s 2025-09-27 00:57:51.782589 | orchestrator | osism.services.cephclient : Include package tasks ----------------------- 0.13s 2025-09-27 00:57:51.783240 | orchestrator | 2025-09-27 00:57:51 | INFO  | Task 452cb9cb-b9d2-4e68-b922-2c6a8a665f16 is in state STARTED 2025-09-27 00:57:51.783266 | orchestrator | 2025-09-27 00:57:51 | INFO  | Wait 1 second(s) until the next check 2025-09-27 00:57:54.858926 | orchestrator | 2025-09-27 00:57:54 | INFO  | Task e53b93e4-c64c-4396-8a50-60d422b94ec3 is in state STARTED 2025-09-27 00:57:54.859741 | orchestrator | 2025-09-27 00:57:54 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 00:57:54.860956 | orchestrator | 2025-09-27 00:57:54 | INFO  | Task 837ad8ab-f13b-4706-84f7-9eb83f6ce45a is in state STARTED 2025-09-27 00:57:54.862147 | orchestrator | 2025-09-27 00:57:54 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 00:57:54.863201 | orchestrator | 2025-09-27 00:57:54 | INFO  | Task 452cb9cb-b9d2-4e68-b922-2c6a8a665f16 is in state STARTED 2025-09-27 00:57:54.863226 | orchestrator | 2025-09-27 00:57:54 | INFO  | Wait 1 second(s) until the next check 2025-09-27 00:57:57.910478 | orchestrator | 2025-09-27 00:57:57 | INFO  | Task e53b93e4-c64c-4396-8a50-60d422b94ec3 is in state STARTED 2025-09-27 00:57:57.913390 | orchestrator | 2025-09-27 00:57:57 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 00:57:57.914381 | orchestrator | 2025-09-27 00:57:57 | INFO  | Task 837ad8ab-f13b-4706-84f7-9eb83f6ce45a is in state STARTED 2025-09-27 00:57:57.916645 | orchestrator | 2025-09-27 00:57:57 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 00:57:57.918323 | orchestrator | 2025-09-27 00:57:57 | INFO  | Task 452cb9cb-b9d2-4e68-b922-2c6a8a665f16 is in state STARTED 2025-09-27 00:57:57.918490 | orchestrator | 2025-09-27 00:57:57 | INFO  | Wait 1 second(s) until the next check 2025-09-27 00:58:00.963762 | orchestrator | 2025-09-27 00:58:00 | INFO  | Task e53b93e4-c64c-4396-8a50-60d422b94ec3 is in state STARTED 2025-09-27 00:58:00.966395 | orchestrator | 2025-09-27 00:58:00 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 00:58:00.969340 | orchestrator | 2025-09-27 00:58:00 | INFO  | Task 837ad8ab-f13b-4706-84f7-9eb83f6ce45a is in state STARTED 2025-09-27 00:58:00.971658 | orchestrator | 2025-09-27 00:58:00 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 00:58:00.973718 | orchestrator | 2025-09-27 00:58:00 | INFO  | Task 452cb9cb-b9d2-4e68-b922-2c6a8a665f16 is in state STARTED 2025-09-27 00:58:00.973957 | orchestrator | 2025-09-27 00:58:00 | INFO  | Wait 1 second(s) until the next check 2025-09-27 00:58:04.028183 | orchestrator | 2025-09-27 00:58:04 | INFO  | Task e53b93e4-c64c-4396-8a50-60d422b94ec3 is in state STARTED 2025-09-27 00:58:04.028285 | orchestrator | 2025-09-27 00:58:04 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 00:58:04.030809 | orchestrator | 2025-09-27 00:58:04 | INFO  | Task 837ad8ab-f13b-4706-84f7-9eb83f6ce45a is in state STARTED 2025-09-27 00:58:04.030837 | orchestrator | 2025-09-27 00:58:04 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 00:58:04.032476 | orchestrator | 2025-09-27 00:58:04 | INFO  | Task 452cb9cb-b9d2-4e68-b922-2c6a8a665f16 is in state STARTED 
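The cephclient play recapped above wraps the Ceph CLI into a container on the manager: the role writes ceph.conf and a keyring under /opt/cephclient, renders a docker-compose.yml, waits for the service to report healthy, and installs small wrapper scripts (ceph, ceph-authtool, rados, radosgw-admin, rbd) that execute the tools inside that container. The compose file it manages looks roughly like the sketch below; the image reference, the sleep command, the mount points and the /usr/local/bin wrapper path are assumptions for illustration, the real template ships with the osism.services.cephclient role:

```yaml
# /opt/cephclient/docker-compose.yml -- illustrative sketch only; the actual
# file is templated by the osism.services.cephclient role.
services:
  cephclient:
    container_name: cephclient
    image: registry.osism.tech/osism/cephclient:latest   # assumed image reference
    restart: unless-stopped
    volumes:
      - /opt/cephclient/configuration:/etc/ceph:ro        # ceph.conf and keyring from the role
      - /opt/cephclient/data:/data
    command: sleep infinity   # keeps the container available for the wrappers

# Wrapper idea: /usr/local/bin/ceph (and rados, rbd, ...) simply runs
#   docker exec cephclient ceph "$@"
# which is why "Copy wrapper scripts" installs one tiny script per tool.
```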
2025-09-27 00:58:04.032498 | orchestrator | 2025-09-27 00:58:04 | INFO  | Wait 1 second(s) until the next check 2025-09-27 00:58:59.397656 | orchestrator | 2025-09-27 00:58:59 | INFO  | Task e53b93e4-c64c-4396-8a50-60d422b94ec3 is in state STARTED 2025-09-27 00:58:59.398182 | orchestrator | 2025-09-27 00:58:59 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED
00:58:59 | INFO  | Task 452cb9cb-b9d2-4e68-b922-2c6a8a665f16 is in state STARTED 2025-09-27 00:58:59.401406 | orchestrator | 2025-09-27 00:58:59 | INFO  | Wait 1 second(s) until the next check 2025-09-27 00:59:02.435302 | orchestrator | 2025-09-27 00:59:02 | INFO  | Task e53b93e4-c64c-4396-8a50-60d422b94ec3 is in state STARTED 2025-09-27 00:59:02.438141 | orchestrator | 2025-09-27 00:59:02 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 00:59:02.442286 | orchestrator | 2025-09-27 00:59:02 | INFO  | Task 837ad8ab-f13b-4706-84f7-9eb83f6ce45a is in state STARTED 2025-09-27 00:59:02.443825 | orchestrator | 2025-09-27 00:59:02 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 00:59:02.445574 | orchestrator | 2025-09-27 00:59:02 | INFO  | Task 452cb9cb-b9d2-4e68-b922-2c6a8a665f16 is in state STARTED 2025-09-27 00:59:02.445609 | orchestrator | 2025-09-27 00:59:02 | INFO  | Wait 1 second(s) until the next check 2025-09-27 00:59:05.482941 | orchestrator | 2025-09-27 00:59:05 | INFO  | Task e53b93e4-c64c-4396-8a50-60d422b94ec3 is in state STARTED 2025-09-27 00:59:05.484168 | orchestrator | 2025-09-27 00:59:05 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 00:59:05.486994 | orchestrator | 2025-09-27 00:59:05 | INFO  | Task 837ad8ab-f13b-4706-84f7-9eb83f6ce45a is in state STARTED 2025-09-27 00:59:05.489179 | orchestrator | 2025-09-27 00:59:05 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 00:59:05.491511 | orchestrator | 2025-09-27 00:59:05 | INFO  | Task 452cb9cb-b9d2-4e68-b922-2c6a8a665f16 is in state STARTED 2025-09-27 00:59:05.491547 | orchestrator | 2025-09-27 00:59:05 | INFO  | Wait 1 second(s) until the next check 2025-09-27 00:59:08.530345 | orchestrator | 2025-09-27 00:59:08 | INFO  | Task e53b93e4-c64c-4396-8a50-60d422b94ec3 is in state STARTED 2025-09-27 00:59:08.531647 | orchestrator | 2025-09-27 00:59:08 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 00:59:08.533465 | orchestrator | 2025-09-27 00:59:08 | INFO  | Task 837ad8ab-f13b-4706-84f7-9eb83f6ce45a is in state STARTED 2025-09-27 00:59:08.535096 | orchestrator | 2025-09-27 00:59:08 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 00:59:08.536270 | orchestrator | 2025-09-27 00:59:08 | INFO  | Task 452cb9cb-b9d2-4e68-b922-2c6a8a665f16 is in state STARTED 2025-09-27 00:59:08.536294 | orchestrator | 2025-09-27 00:59:08 | INFO  | Wait 1 second(s) until the next check 2025-09-27 00:59:11.584616 | orchestrator | 2025-09-27 00:59:11 | INFO  | Task e53b93e4-c64c-4396-8a50-60d422b94ec3 is in state STARTED 2025-09-27 00:59:11.585757 | orchestrator | 2025-09-27 00:59:11 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 00:59:11.591242 | orchestrator | 2025-09-27 00:59:11 | INFO  | Task 837ad8ab-f13b-4706-84f7-9eb83f6ce45a is in state SUCCESS 2025-09-27 00:59:11.597881 | orchestrator | 2025-09-27 00:59:11.597926 | orchestrator | 2025-09-27 00:59:11.597938 | orchestrator | PLAY [Group hosts based on configuration] ************************************** 2025-09-27 00:59:11.597989 | orchestrator | 2025-09-27 00:59:11.598001 | orchestrator | TASK [Group hosts based on Kolla action] *************************************** 2025-09-27 00:59:11.598011 | orchestrator | Saturday 27 September 2025 00:57:54 +0000 (0:00:00.329) 0:00:00.329 **** 2025-09-27 00:59:11.598135 | orchestrator | ok: [testbed-manager] 2025-09-27 
00:59:11.598147 | orchestrator | ok: [testbed-node-0] 2025-09-27 00:59:11.598157 | orchestrator | ok: [testbed-node-1] 2025-09-27 00:59:11.598190 | orchestrator | ok: [testbed-node-2] 2025-09-27 00:59:11.598199 | orchestrator | ok: [testbed-node-3] 2025-09-27 00:59:11.598209 | orchestrator | ok: [testbed-node-4] 2025-09-27 00:59:11.598218 | orchestrator | ok: [testbed-node-5] 2025-09-27 00:59:11.598228 | orchestrator | 2025-09-27 00:59:11.598238 | orchestrator | TASK [Group hosts based on enabled services] *********************************** 2025-09-27 00:59:11.598247 | orchestrator | Saturday 27 September 2025 00:57:55 +0000 (0:00:00.955) 0:00:01.284 **** 2025-09-27 00:59:11.598258 | orchestrator | ok: [testbed-manager] => (item=enable_prometheus_True) 2025-09-27 00:59:11.598307 | orchestrator | ok: [testbed-node-0] => (item=enable_prometheus_True) 2025-09-27 00:59:11.598319 | orchestrator | ok: [testbed-node-1] => (item=enable_prometheus_True) 2025-09-27 00:59:11.598329 | orchestrator | ok: [testbed-node-2] => (item=enable_prometheus_True) 2025-09-27 00:59:11.598338 | orchestrator | ok: [testbed-node-3] => (item=enable_prometheus_True) 2025-09-27 00:59:11.598347 | orchestrator | ok: [testbed-node-4] => (item=enable_prometheus_True) 2025-09-27 00:59:11.598357 | orchestrator | ok: [testbed-node-5] => (item=enable_prometheus_True) 2025-09-27 00:59:11.598366 | orchestrator | 2025-09-27 00:59:11.598376 | orchestrator | PLAY [Apply role prometheus] *************************************************** 2025-09-27 00:59:11.598385 | orchestrator | 2025-09-27 00:59:11.598395 | orchestrator | TASK [prometheus : include_tasks] ********************************************** 2025-09-27 00:59:11.598557 | orchestrator | Saturday 27 September 2025 00:57:56 +0000 (0:00:01.014) 0:00:02.299 **** 2025-09-27 00:59:11.598570 | orchestrator | included: /ansible/roles/prometheus/tasks/deploy.yml for testbed-manager, testbed-node-0, testbed-node-1, testbed-node-2, testbed-node-3, testbed-node-4, testbed-node-5 2025-09-27 00:59:11.598582 | orchestrator | 2025-09-27 00:59:11.598593 | orchestrator | TASK [prometheus : Ensuring config directories exist] ************************** 2025-09-27 00:59:11.598604 | orchestrator | Saturday 27 September 2025 00:57:57 +0000 (0:00:01.519) 0:00:03.818 **** 2025-09-27 00:59:11.598628 | orchestrator | changed: [testbed-node-0] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-node-exporter:2024.2', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}}) 2025-09-27 00:59:11.598644 | orchestrator | changed: [testbed-manager] => (item={'key': 'prometheus-server', 'value': {'container_name': 'prometheus_server', 'group': 'prometheus', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-v2-server:2024.2', 'volumes': ['/etc/kolla/prometheus-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'prometheus_v2:/var/lib/prometheus', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'prometheus_server': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9091', 'active_passive': True}, 'prometheus_server_external': {'enabled': False, 'mode': 'http', 'external': True, 
'external_fqdn': 'api.testbed.osism.xyz', 'port': '9091', 'listen_port': '9091', 'active_passive': True}}}}) 2025-09-27 00:59:11.598656 | orchestrator | changed: [testbed-node-1] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-node-exporter:2024.2', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}}) 2025-09-27 00:59:11.598670 | orchestrator | changed: [testbed-node-0] => (item={'key': 'prometheus-mysqld-exporter', 'value': {'container_name': 'prometheus_mysqld_exporter', 'group': 'prometheus-mysqld-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-mysqld-exporter:2024.2', 'volumes': ['/etc/kolla/prometheus-mysqld-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-09-27 00:59:11.598749 | orchestrator | changed: [testbed-node-2] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-node-exporter:2024.2', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}}) 2025-09-27 00:59:11.598785 | orchestrator | changed: [testbed-manager] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-node-exporter:2024.2', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}}) 2025-09-27 00:59:11.598797 | orchestrator | changed: [testbed-node-1] => (item={'key': 'prometheus-mysqld-exporter', 'value': {'container_name': 'prometheus_mysqld_exporter', 'group': 'prometheus-mysqld-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-mysqld-exporter:2024.2', 'volumes': ['/etc/kolla/prometheus-mysqld-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-09-27 00:59:11.598867 | orchestrator | changed: [testbed-node-3] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-node-exporter:2024.2', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}}) 2025-09-27 00:59:11.598886 | orchestrator | changed: [testbed-manager] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 
'registry.osism.tech/kolla/prometheus-cadvisor:2024.2', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}}) 2025-09-27 00:59:11.598916 | orchestrator | changed: [testbed-node-0] => (item={'key': 'prometheus-memcached-exporter', 'value': {'container_name': 'prometheus_memcached_exporter', 'group': 'prometheus-memcached-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-memcached-exporter:2024.2', 'volumes': ['/etc/kolla/prometheus-memcached-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-09-27 00:59:11.598926 | orchestrator | changed: [testbed-node-5] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-node-exporter:2024.2', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}}) 2025-09-27 00:59:11.598943 | orchestrator | changed: [testbed-node-4] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-node-exporter:2024.2', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}}) 2025-09-27 00:59:11.598962 | orchestrator | changed: [testbed-node-2] => (item={'key': 'prometheus-mysqld-exporter', 'value': {'container_name': 'prometheus_mysqld_exporter', 'group': 'prometheus-mysqld-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-mysqld-exporter:2024.2', 'volumes': ['/etc/kolla/prometheus-mysqld-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-09-27 00:59:11.598972 | orchestrator | changed: [testbed-node-3] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-cadvisor:2024.2', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}}) 2025-09-27 00:59:11.598982 | orchestrator | changed: [testbed-node-1] => (item={'key': 'prometheus-memcached-exporter', 'value': {'container_name': 'prometheus_memcached_exporter', 'group': 'prometheus-memcached-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-memcached-exporter:2024.2', 'volumes': ['/etc/kolla/prometheus-memcached-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-09-27 00:59:11.598993 | orchestrator | changed: [testbed-node-0] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-cadvisor:2024.2', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}}) 2025-09-27 00:59:11.599008 | orchestrator | changed: [testbed-manager] => (item={'key': 'prometheus-alertmanager', 'value': {'container_name': 'prometheus_alertmanager', 'group': 'prometheus-alertmanager', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-alertmanager:2024.2', 'volumes': ['/etc/kolla/prometheus-alertmanager/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'prometheus:/var/lib/prometheus'], 'dimensions': {}, 'haproxy': {'prometheus_alertmanager': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9093', 'auth_user': 'admin', 'auth_pass': 'BXo64rLqmF7bTbWLDOnNJlD0qJ4BSTWocNHVNKU2', 'active_passive': True}, 'prometheus_alertmanager_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9093', 'listen_port': '9093', 'auth_user': 'admin', 'auth_pass': 'BXo64rLqmF7bTbWLDOnNJlD0qJ4BSTWocNHVNKU2', 'active_passive': True}}}}) 2025-09-27 00:59:11.599023 | orchestrator | changed: [testbed-node-5] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-cadvisor:2024.2', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}}) 2025-09-27 00:59:11.599060 | orchestrator | changed: [testbed-node-2] => (item={'key': 'prometheus-memcached-exporter', 'value': {'container_name': 'prometheus_memcached_exporter', 'group': 'prometheus-memcached-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-memcached-exporter:2024.2', 'volumes': ['/etc/kolla/prometheus-memcached-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-09-27 00:59:11.599078 | orchestrator | changed: [testbed-node-4] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-cadvisor:2024.2', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}}) 2025-09-27 00:59:11.599089 | orchestrator | changed: [testbed-node-3] => (item={'key': 'prometheus-libvirt-exporter', 'value': 
{'container_name': 'prometheus_libvirt_exporter', 'group': 'prometheus-libvirt-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-libvirt-exporter:2024.2', 'volumes': ['/etc/kolla/prometheus-libvirt-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/run/libvirt:/run/libvirt:ro'], 'dimensions': {}}}) 2025-09-27 00:59:11.599099 | orchestrator | changed: [testbed-node-1] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-cadvisor:2024.2', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}}) 2025-09-27 00:59:11.599110 | orchestrator | changed: [testbed-manager] => (item={'key': 'prometheus-blackbox-exporter', 'value': {'container_name': 'prometheus_blackbox_exporter', 'group': 'prometheus-blackbox-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-blackbox-exporter:2024.2', 'volumes': ['/etc/kolla/prometheus-blackbox-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-09-27 00:59:11.599125 | orchestrator | changed: [testbed-node-5] => (item={'key': 'prometheus-libvirt-exporter', 'value': {'container_name': 'prometheus_libvirt_exporter', 'group': 'prometheus-libvirt-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-libvirt-exporter:2024.2', 'volumes': ['/etc/kolla/prometheus-libvirt-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/run/libvirt:/run/libvirt:ro'], 'dimensions': {}}}) 2025-09-27 00:59:11.599136 | orchestrator | changed: [testbed-node-0] => (item={'key': 'prometheus-elasticsearch-exporter', 'value': {'container_name': 'prometheus_elasticsearch_exporter', 'group': 'prometheus-elasticsearch-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-elasticsearch-exporter:2024.2', 'volumes': ['/etc/kolla/prometheus-elasticsearch-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-09-27 00:59:11.599146 | orchestrator | changed: [testbed-node-4] => (item={'key': 'prometheus-libvirt-exporter', 'value': {'container_name': 'prometheus_libvirt_exporter', 'group': 'prometheus-libvirt-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-libvirt-exporter:2024.2', 'volumes': ['/etc/kolla/prometheus-libvirt-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/run/libvirt:/run/libvirt:ro'], 'dimensions': {}}}) 2025-09-27 00:59:11.599163 | orchestrator | changed: [testbed-node-2] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-cadvisor:2024.2', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 
'/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}}) 2025-09-27 00:59:11.599178 | orchestrator | changed: [testbed-node-1] => (item={'key': 'prometheus-elasticsearch-exporter', 'value': {'container_name': 'prometheus_elasticsearch_exporter', 'group': 'prometheus-elasticsearch-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-elasticsearch-exporter:2024.2', 'volumes': ['/etc/kolla/prometheus-elasticsearch-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-09-27 00:59:11.599189 | orchestrator | changed: [testbed-node-2] => (item={'key': 'prometheus-elasticsearch-exporter', 'value': {'container_name': 'prometheus_elasticsearch_exporter', 'group': 'prometheus-elasticsearch-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-elasticsearch-exporter:2024.2', 'volumes': ['/etc/kolla/prometheus-elasticsearch-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-09-27 00:59:11.599200 | orchestrator | 2025-09-27 00:59:11.599210 | orchestrator | TASK [prometheus : include_tasks] ********************************************** 2025-09-27 00:59:11.599220 | orchestrator | Saturday 27 September 2025 00:58:00 +0000 (0:00:03.092) 0:00:06.911 **** 2025-09-27 00:59:11.599241 | orchestrator | included: /ansible/roles/prometheus/tasks/copy-certs.yml for testbed-manager, testbed-node-0, testbed-node-1, testbed-node-2, testbed-node-3, testbed-node-4, testbed-node-5 2025-09-27 00:59:11.599252 | orchestrator | 2025-09-27 00:59:11.599261 | orchestrator | TASK [service-cert-copy : prometheus | Copying over extra CA certificates] ***** 2025-09-27 00:59:11.599271 | orchestrator | Saturday 27 September 2025 00:58:02 +0000 (0:00:01.394) 0:00:08.305 **** 2025-09-27 00:59:11.599286 | orchestrator | changed: [testbed-manager] => (item={'key': 'prometheus-server', 'value': {'container_name': 'prometheus_server', 'group': 'prometheus', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-v2-server:2024.2', 'volumes': ['/etc/kolla/prometheus-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'prometheus_v2:/var/lib/prometheus', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'prometheus_server': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9091', 'active_passive': True}, 'prometheus_server_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9091', 'listen_port': '9091', 'active_passive': True}}}}) 2025-09-27 00:59:11.599297 | orchestrator | changed: [testbed-node-0] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-node-exporter:2024.2', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}}) 2025-09-27 00:59:11.599430 | orchestrator | changed: [testbed-node-2] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 
'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-node-exporter:2024.2', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}}) 2025-09-27 00:59:11.599443 | orchestrator | changed: [testbed-node-1] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-node-exporter:2024.2', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}}) 2025-09-27 00:59:11.599460 | orchestrator | changed: [testbed-node-3] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-node-exporter:2024.2', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}}) 2025-09-27 00:59:11.599470 | orchestrator | changed: [testbed-node-4] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-node-exporter:2024.2', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}}) 2025-09-27 00:59:11.599480 | orchestrator | changed: [testbed-node-5] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-node-exporter:2024.2', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}}) 2025-09-27 00:59:11.599490 | orchestrator | changed: [testbed-manager] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-node-exporter:2024.2', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}}) 2025-09-27 00:59:11.599505 | orchestrator | changed: [testbed-node-1] => (item={'key': 'prometheus-mysqld-exporter', 'value': {'container_name': 'prometheus_mysqld_exporter', 'group': 'prometheus-mysqld-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-mysqld-exporter:2024.2', 'volumes': ['/etc/kolla/prometheus-mysqld-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 
'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-09-27 00:59:11.599515 | orchestrator | changed: [testbed-node-0] => (item={'key': 'prometheus-mysqld-exporter', 'value': {'container_name': 'prometheus_mysqld_exporter', 'group': 'prometheus-mysqld-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-mysqld-exporter:2024.2', 'volumes': ['/etc/kolla/prometheus-mysqld-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-09-27 00:59:11.599532 | orchestrator | changed: [testbed-node-3] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-cadvisor:2024.2', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}}) 2025-09-27 00:59:11.599542 | orchestrator | changed: [testbed-node-2] => (item={'key': 'prometheus-mysqld-exporter', 'value': {'container_name': 'prometheus_mysqld_exporter', 'group': 'prometheus-mysqld-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-mysqld-exporter:2024.2', 'volumes': ['/etc/kolla/prometheus-mysqld-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-09-27 00:59:11.599557 | orchestrator | changed: [testbed-node-4] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-cadvisor:2024.2', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}}) 2025-09-27 00:59:11.599569 | orchestrator | changed: [testbed-node-5] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-cadvisor:2024.2', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}}) 2025-09-27 00:59:11.599580 | orchestrator | changed: [testbed-manager] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-cadvisor:2024.2', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}}) 2025-09-27 00:59:11.599590 | orchestrator | changed: [testbed-node-1] => (item={'key': 
'prometheus-memcached-exporter', 'value': {'container_name': 'prometheus_memcached_exporter', 'group': 'prometheus-memcached-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-memcached-exporter:2024.2', 'volumes': ['/etc/kolla/prometheus-memcached-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-09-27 00:59:11.599605 | orchestrator | changed: [testbed-node-0] => (item={'key': 'prometheus-memcached-exporter', 'value': {'container_name': 'prometheus_memcached_exporter', 'group': 'prometheus-memcached-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-memcached-exporter:2024.2', 'volumes': ['/etc/kolla/prometheus-memcached-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-09-27 00:59:11.599621 | orchestrator | changed: [testbed-node-3] => (item={'key': 'prometheus-libvirt-exporter', 'value': {'container_name': 'prometheus_libvirt_exporter', 'group': 'prometheus-libvirt-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-libvirt-exporter:2024.2', 'volumes': ['/etc/kolla/prometheus-libvirt-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/run/libvirt:/run/libvirt:ro'], 'dimensions': {}}}) 2025-09-27 00:59:11.599631 | orchestrator | changed: [testbed-node-2] => (item={'key': 'prometheus-memcached-exporter', 'value': {'container_name': 'prometheus_memcached_exporter', 'group': 'prometheus-memcached-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-memcached-exporter:2024.2', 'volumes': ['/etc/kolla/prometheus-memcached-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-09-27 00:59:11.599641 | orchestrator | changed: [testbed-node-5] => (item={'key': 'prometheus-libvirt-exporter', 'value': {'container_name': 'prometheus_libvirt_exporter', 'group': 'prometheus-libvirt-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-libvirt-exporter:2024.2', 'volumes': ['/etc/kolla/prometheus-libvirt-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/run/libvirt:/run/libvirt:ro'], 'dimensions': {}}}) 2025-09-27 00:59:11.599657 | orchestrator | changed: [testbed-manager] => (item={'key': 'prometheus-alertmanager', 'value': {'container_name': 'prometheus_alertmanager', 'group': 'prometheus-alertmanager', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-alertmanager:2024.2', 'volumes': ['/etc/kolla/prometheus-alertmanager/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'prometheus:/var/lib/prometheus'], 'dimensions': {}, 'haproxy': {'prometheus_alertmanager': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9093', 'auth_user': 'admin', 'auth_pass': 'BXo64rLqmF7bTbWLDOnNJlD0qJ4BSTWocNHVNKU2', 'active_passive': True}, 'prometheus_alertmanager_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9093', 'listen_port': '9093', 'auth_user': 'admin', 'auth_pass': 'BXo64rLqmF7bTbWLDOnNJlD0qJ4BSTWocNHVNKU2', 'active_passive': True}}}}) 2025-09-27 
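Each (item=...) in these loops is one entry of the role's service map: the key is the service name, and the value bundles the container_name, the inventory group, an enabled flag, the image, the bind-mounted volumes and, for front-end services, the haproxy listener settings. A rough Python sketch of that data shape and of iterating only enabled entries, abridged from the values logged above and intended purely as an illustration (not the kolla-ansible task code):

# One abridged service definition in the same shape as the logged items.
services = {
    "prometheus-node-exporter": {
        "container_name": "prometheus_node_exporter",
        "group": "prometheus-node-exporter",
        "enabled": True,
        "image": "registry.osism.tech/kolla/prometheus-node-exporter:2024.2",
        "pid_mode": "host",
        "volumes": [
            "/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro",
            "kolla_logs:/var/log/kolla/",
            "/:/host:ro,rslave",
        ],
        "dimensions": {},
    },
}

# The role loops over this mapping and only acts on entries whose
# "enabled" flag is true (sketch of the selection, not the real task logic).
for name, svc in services.items():
    if svc.get("enabled"):
        print(f"{name}: image={svc['image']}, group={svc['group']}")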
00:59:11.599669 | orchestrator | changed: [testbed-node-4] => (item={'key': 'prometheus-libvirt-exporter', 'value': {'container_name': 'prometheus_libvirt_exporter', 'group': 'prometheus-libvirt-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-libvirt-exporter:2024.2', 'volumes': ['/etc/kolla/prometheus-libvirt-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/run/libvirt:/run/libvirt:ro'], 'dimensions': {}}}) 2025-09-27 00:59:11.599679 | orchestrator | changed: [testbed-node-1] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-cadvisor:2024.2', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}}) 2025-09-27 00:59:11.599693 | orchestrator | changed: [testbed-node-0] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-cadvisor:2024.2', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}}) 2025-09-27 00:59:11.599709 | orchestrator | changed: [testbed-node-2] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-cadvisor:2024.2', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}}) 2025-09-27 00:59:11.599719 | orchestrator | changed: [testbed-manager] => (item={'key': 'prometheus-blackbox-exporter', 'value': {'container_name': 'prometheus_blackbox_exporter', 'group': 'prometheus-blackbox-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-blackbox-exporter:2024.2', 'volumes': ['/etc/kolla/prometheus-blackbox-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-09-27 00:59:11.599729 | orchestrator | changed: [testbed-node-1] => (item={'key': 'prometheus-elasticsearch-exporter', 'value': {'container_name': 'prometheus_elasticsearch_exporter', 'group': 'prometheus-elasticsearch-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-elasticsearch-exporter:2024.2', 'volumes': ['/etc/kolla/prometheus-elasticsearch-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-09-27 00:59:11.600559 | orchestrator | changed: [testbed-node-0] => (item={'key': 'prometheus-elasticsearch-exporter', 'value': {'container_name': 'prometheus_elasticsearch_exporter', 
'group': 'prometheus-elasticsearch-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-elasticsearch-exporter:2024.2', 'volumes': ['/etc/kolla/prometheus-elasticsearch-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-09-27 00:59:11.600582 | orchestrator | changed: [testbed-node-2] => (item={'key': 'prometheus-elasticsearch-exporter', 'value': {'container_name': 'prometheus_elasticsearch_exporter', 'group': 'prometheus-elasticsearch-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-elasticsearch-exporter:2024.2', 'volumes': ['/etc/kolla/prometheus-elasticsearch-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-09-27 00:59:11.600593 | orchestrator | 2025-09-27 00:59:11.600603 | orchestrator | TASK [service-cert-copy : prometheus | Copying over backend internal TLS certificate] *** 2025-09-27 00:59:11.600613 | orchestrator | Saturday 27 September 2025 00:58:07 +0000 (0:00:05.710) 0:00:14.016 **** 2025-09-27 00:59:11.600623 | orchestrator | skipping: [testbed-manager] => (item={'key': 'prometheus-server', 'value': {'container_name': 'prometheus_server', 'group': 'prometheus', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-v2-server:2024.2', 'volumes': ['/etc/kolla/prometheus-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'prometheus_v2:/var/lib/prometheus', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'prometheus_server': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9091', 'active_passive': True}, 'prometheus_server_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9091', 'listen_port': '9091', 'active_passive': True}}}})  2025-09-27 00:59:11.600641 | orchestrator | skipping: [testbed-manager] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-node-exporter:2024.2', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}})  2025-09-27 00:59:11.600660 | orchestrator | skipping: [testbed-manager] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-cadvisor:2024.2', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}})  2025-09-27 00:59:11.600672 | orchestrator | skipping: [testbed-manager] => (item={'key': 'prometheus-alertmanager', 'value': {'container_name': 'prometheus_alertmanager', 'group': 'prometheus-alertmanager', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-alertmanager:2024.2', 'volumes': ['/etc/kolla/prometheus-alertmanager/:/var/lib/kolla/config_files/:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'prometheus:/var/lib/prometheus'], 'dimensions': {}, 'haproxy': {'prometheus_alertmanager': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9093', 'auth_user': 'admin', 'auth_pass': 'BXo64rLqmF7bTbWLDOnNJlD0qJ4BSTWocNHVNKU2', 'active_passive': True}, 'prometheus_alertmanager_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9093', 'listen_port': '9093', 'auth_user': 'admin', 'auth_pass': 'BXo64rLqmF7bTbWLDOnNJlD0qJ4BSTWocNHVNKU2', 'active_passive': True}}}})  2025-09-27 00:59:11.600690 | orchestrator | skipping: [testbed-manager] => (item={'key': 'prometheus-blackbox-exporter', 'value': {'container_name': 'prometheus_blackbox_exporter', 'group': 'prometheus-blackbox-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-blackbox-exporter:2024.2', 'volumes': ['/etc/kolla/prometheus-blackbox-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2025-09-27 00:59:11.600700 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-node-exporter:2024.2', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}})  2025-09-27 00:59:11.600711 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'prometheus-mysqld-exporter', 'value': {'container_name': 'prometheus_mysqld_exporter', 'group': 'prometheus-mysqld-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-mysqld-exporter:2024.2', 'volumes': ['/etc/kolla/prometheus-mysqld-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2025-09-27 00:59:11.600721 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'prometheus-memcached-exporter', 'value': {'container_name': 'prometheus_memcached_exporter', 'group': 'prometheus-memcached-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-memcached-exporter:2024.2', 'volumes': ['/etc/kolla/prometheus-memcached-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2025-09-27 00:59:11.600741 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-cadvisor:2024.2', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}})  2025-09-27 00:59:11.600752 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'prometheus-elasticsearch-exporter', 'value': {'container_name': 'prometheus_elasticsearch_exporter', 'group': 
'prometheus-elasticsearch-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-elasticsearch-exporter:2024.2', 'volumes': ['/etc/kolla/prometheus-elasticsearch-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2025-09-27 00:59:11.600762 | orchestrator | skipping: [testbed-manager] 2025-09-27 00:59:11.600773 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-node-exporter:2024.2', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}})  2025-09-27 00:59:11.600783 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'prometheus-mysqld-exporter', 'value': {'container_name': 'prometheus_mysqld_exporter', 'group': 'prometheus-mysqld-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-mysqld-exporter:2024.2', 'volumes': ['/etc/kolla/prometheus-mysqld-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2025-09-27 00:59:11.600798 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'prometheus-memcached-exporter', 'value': {'container_name': 'prometheus_memcached_exporter', 'group': 'prometheus-memcached-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-memcached-exporter:2024.2', 'volumes': ['/etc/kolla/prometheus-memcached-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2025-09-27 00:59:11.600809 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-cadvisor:2024.2', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}})  2025-09-27 00:59:11.600819 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'prometheus-elasticsearch-exporter', 'value': {'container_name': 'prometheus_elasticsearch_exporter', 'group': 'prometheus-elasticsearch-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-elasticsearch-exporter:2024.2', 'volumes': ['/etc/kolla/prometheus-elasticsearch-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2025-09-27 00:59:11.600829 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-node-exporter:2024.2', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}})  2025-09-27 00:59:11.600849 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'prometheus-mysqld-exporter', 'value': {'container_name': 'prometheus_mysqld_exporter', 'group': 'prometheus-mysqld-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-mysqld-exporter:2024.2', 'volumes': ['/etc/kolla/prometheus-mysqld-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2025-09-27 00:59:11.600860 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:59:11.600870 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'prometheus-memcached-exporter', 'value': {'container_name': 'prometheus_memcached_exporter', 'group': 'prometheus-memcached-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-memcached-exporter:2024.2', 'volumes': ['/etc/kolla/prometheus-memcached-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2025-09-27 00:59:11.600880 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:59:11.600890 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-cadvisor:2024.2', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}})  2025-09-27 00:59:11.600900 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'prometheus-elasticsearch-exporter', 'value': {'container_name': 'prometheus_elasticsearch_exporter', 'group': 'prometheus-elasticsearch-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-elasticsearch-exporter:2024.2', 'volumes': ['/etc/kolla/prometheus-elasticsearch-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2025-09-27 00:59:11.600910 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:59:11.600926 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-node-exporter:2024.2', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}})  2025-09-27 00:59:11.600936 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-cadvisor:2024.2', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', 
'/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}})  2025-09-27 00:59:11.600947 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'prometheus-libvirt-exporter', 'value': {'container_name': 'prometheus_libvirt_exporter', 'group': 'prometheus-libvirt-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-libvirt-exporter:2024.2', 'volumes': ['/etc/kolla/prometheus-libvirt-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/run/libvirt:/run/libvirt:ro'], 'dimensions': {}}})  2025-09-27 00:59:11.600963 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:59:11.600973 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-node-exporter:2024.2', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}})  2025-09-27 00:59:11.600987 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-cadvisor:2024.2', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}})  2025-09-27 00:59:11.600998 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'prometheus-libvirt-exporter', 'value': {'container_name': 'prometheus_libvirt_exporter', 'group': 'prometheus-libvirt-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-libvirt-exporter:2024.2', 'volumes': ['/etc/kolla/prometheus-libvirt-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/run/libvirt:/run/libvirt:ro'], 'dimensions': {}}})  2025-09-27 00:59:11.601008 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:59:11.601018 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-node-exporter:2024.2', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}})  2025-09-27 00:59:11.601056 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-cadvisor:2024.2', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', 
'/dev/disk/:/dev/disk:ro'], 'dimensions': {}}})  2025-09-27 00:59:11.601074 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'prometheus-libvirt-exporter', 'value': {'container_name': 'prometheus_libvirt_exporter', 'group': 'prometheus-libvirt-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-libvirt-exporter:2024.2', 'volumes': ['/etc/kolla/prometheus-libvirt-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/run/libvirt:/run/libvirt:ro'], 'dimensions': {}}})  2025-09-27 00:59:11.601085 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:59:11.601095 | orchestrator | 2025-09-27 00:59:11.601104 | orchestrator | TASK [service-cert-copy : prometheus | Copying over backend internal TLS key] *** 2025-09-27 00:59:11.601114 | orchestrator | Saturday 27 September 2025 00:58:09 +0000 (0:00:01.480) 0:00:15.497 **** 2025-09-27 00:59:11.601135 | orchestrator | skipping: [testbed-manager] => (item={'key': 'prometheus-server', 'value': {'container_name': 'prometheus_server', 'group': 'prometheus', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-v2-server:2024.2', 'volumes': ['/etc/kolla/prometheus-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'prometheus_v2:/var/lib/prometheus', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'prometheus_server': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9091', 'active_passive': True}, 'prometheus_server_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9091', 'listen_port': '9091', 'active_passive': True}}}})  2025-09-27 00:59:11.601145 | orchestrator | skipping: [testbed-manager] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-node-exporter:2024.2', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}})  2025-09-27 00:59:11.601160 | orchestrator | skipping: [testbed-manager] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-cadvisor:2024.2', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}})  2025-09-27 00:59:11.601172 | orchestrator | skipping: [testbed-manager] => (item={'key': 'prometheus-alertmanager', 'value': {'container_name': 'prometheus_alertmanager', 'group': 'prometheus-alertmanager', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-alertmanager:2024.2', 'volumes': ['/etc/kolla/prometheus-alertmanager/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'prometheus:/var/lib/prometheus'], 'dimensions': {}, 'haproxy': {'prometheus_alertmanager': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9093', 'auth_user': 'admin', 
'auth_pass': 'BXo64rLqmF7bTbWLDOnNJlD0qJ4BSTWocNHVNKU2', 'active_passive': True}, 'prometheus_alertmanager_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9093', 'listen_port': '9093', 'auth_user': 'admin', 'auth_pass': 'BXo64rLqmF7bTbWLDOnNJlD0qJ4BSTWocNHVNKU2', 'active_passive': True}}}})  2025-09-27 00:59:11.601185 | orchestrator | skipping: [testbed-manager] => (item={'key': 'prometheus-blackbox-exporter', 'value': {'container_name': 'prometheus_blackbox_exporter', 'group': 'prometheus-blackbox-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-blackbox-exporter:2024.2', 'volumes': ['/etc/kolla/prometheus-blackbox-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2025-09-27 00:59:11.601203 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-node-exporter:2024.2', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}})  2025-09-27 00:59:11.601215 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'prometheus-mysqld-exporter', 'value': {'container_name': 'prometheus_mysqld_exporter', 'group': 'prometheus-mysqld-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-mysqld-exporter:2024.2', 'volumes': ['/etc/kolla/prometheus-mysqld-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2025-09-27 00:59:11.601232 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'prometheus-memcached-exporter', 'value': {'container_name': 'prometheus_memcached_exporter', 'group': 'prometheus-memcached-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-memcached-exporter:2024.2', 'volumes': ['/etc/kolla/prometheus-memcached-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2025-09-27 00:59:11.601244 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-cadvisor:2024.2', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}})  2025-09-27 00:59:11.601260 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'prometheus-elasticsearch-exporter', 'value': {'container_name': 'prometheus_elasticsearch_exporter', 'group': 'prometheus-elasticsearch-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-elasticsearch-exporter:2024.2', 'volumes': ['/etc/kolla/prometheus-elasticsearch-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2025-09-27 00:59:11.601272 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-node-exporter:2024.2', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}})  2025-09-27 00:59:11.601283 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'prometheus-mysqld-exporter', 'value': {'container_name': 'prometheus_mysqld_exporter', 'group': 'prometheus-mysqld-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-mysqld-exporter:2024.2', 'volumes': ['/etc/kolla/prometheus-mysqld-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2025-09-27 00:59:11.601295 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'prometheus-memcached-exporter', 'value': {'container_name': 'prometheus_memcached_exporter', 'group': 'prometheus-memcached-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-memcached-exporter:2024.2', 'volumes': ['/etc/kolla/prometheus-memcached-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2025-09-27 00:59:11.601313 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-cadvisor:2024.2', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}})  2025-09-27 00:59:11.601331 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'prometheus-elasticsearch-exporter', 'value': {'container_name': 'prometheus_elasticsearch_exporter', 'group': 'prometheus-elasticsearch-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-elasticsearch-exporter:2024.2', 'volumes': ['/etc/kolla/prometheus-elasticsearch-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2025-09-27 00:59:11.601343 | orchestrator | skipping: [testbed-manager] 2025-09-27 00:59:11.601354 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:59:11.601365 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:59:11.601376 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-node-exporter:2024.2', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 
'dimensions': {}}})  2025-09-27 00:59:11.601387 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'prometheus-mysqld-exporter', 'value': {'container_name': 'prometheus_mysqld_exporter', 'group': 'prometheus-mysqld-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-mysqld-exporter:2024.2', 'volumes': ['/etc/kolla/prometheus-mysqld-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2025-09-27 00:59:11.601403 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'prometheus-memcached-exporter', 'value': {'container_name': 'prometheus_memcached_exporter', 'group': 'prometheus-memcached-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-memcached-exporter:2024.2', 'volumes': ['/etc/kolla/prometheus-memcached-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2025-09-27 00:59:11.601414 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-cadvisor:2024.2', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}})  2025-09-27 00:59:11.601426 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'prometheus-elasticsearch-exporter', 'value': {'container_name': 'prometheus_elasticsearch_exporter', 'group': 'prometheus-elasticsearch-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-elasticsearch-exporter:2024.2', 'volumes': ['/etc/kolla/prometheus-elasticsearch-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}})  2025-09-27 00:59:11.601437 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:59:11.601452 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-node-exporter:2024.2', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}})  2025-09-27 00:59:11.601470 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-cadvisor:2024.2', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}})  2025-09-27 00:59:11.601482 | orchestrator | skipping: [testbed-node-3] => (item={'key': 'prometheus-libvirt-exporter', 'value': 
{'container_name': 'prometheus_libvirt_exporter', 'group': 'prometheus-libvirt-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-libvirt-exporter:2024.2', 'volumes': ['/etc/kolla/prometheus-libvirt-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/run/libvirt:/run/libvirt:ro'], 'dimensions': {}}})  2025-09-27 00:59:11.601493 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:59:11.601505 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-node-exporter:2024.2', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}})  2025-09-27 00:59:11.601521 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-cadvisor:2024.2', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}})  2025-09-27 00:59:11.601531 | orchestrator | skipping: [testbed-node-4] => (item={'key': 'prometheus-libvirt-exporter', 'value': {'container_name': 'prometheus_libvirt_exporter', 'group': 'prometheus-libvirt-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-libvirt-exporter:2024.2', 'volumes': ['/etc/kolla/prometheus-libvirt-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/run/libvirt:/run/libvirt:ro'], 'dimensions': {}}})  2025-09-27 00:59:11.601541 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:59:11.601551 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-node-exporter:2024.2', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}})  2025-09-27 00:59:11.601561 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-cadvisor:2024.2', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}})  2025-09-27 00:59:11.601584 | orchestrator | skipping: [testbed-node-5] => (item={'key': 'prometheus-libvirt-exporter', 'value': {'container_name': 'prometheus_libvirt_exporter', 'group': 'prometheus-libvirt-exporter', 
'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-libvirt-exporter:2024.2', 'volumes': ['/etc/kolla/prometheus-libvirt-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/run/libvirt:/run/libvirt:ro'], 'dimensions': {}}})  2025-09-27 00:59:11.601595 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:59:11.601604 | orchestrator | 2025-09-27 00:59:11.601614 | orchestrator | TASK [prometheus : Copying over config.json files] ***************************** 2025-09-27 00:59:11.601624 | orchestrator | Saturday 27 September 2025 00:58:11 +0000 (0:00:02.030) 0:00:17.527 **** 2025-09-27 00:59:11.601634 | orchestrator | changed: [testbed-node-0] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-node-exporter:2024.2', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}}) 2025-09-27 00:59:11.601644 | orchestrator | changed: [testbed-manager] => (item={'key': 'prometheus-server', 'value': {'container_name': 'prometheus_server', 'group': 'prometheus', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-v2-server:2024.2', 'volumes': ['/etc/kolla/prometheus-server/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'prometheus_v2:/var/lib/prometheus', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'prometheus_server': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9091', 'active_passive': True}, 'prometheus_server_external': {'enabled': False, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9091', 'listen_port': '9091', 'active_passive': True}}}}) 2025-09-27 00:59:11.601659 | orchestrator | changed: [testbed-node-1] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-node-exporter:2024.2', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}}) 2025-09-27 00:59:11.601669 | orchestrator | changed: [testbed-node-2] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-node-exporter:2024.2', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}}) 2025-09-27 00:59:11.601679 | orchestrator | changed: [testbed-node-3] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-node-exporter:2024.2', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}}) 2025-09-27 00:59:11.601689 | orchestrator | changed: [testbed-node-4] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-node-exporter:2024.2', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}}) 2025-09-27 00:59:11.601710 | orchestrator | changed: [testbed-node-5] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-node-exporter:2024.2', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}}) 2025-09-27 00:59:11.601721 | orchestrator | changed: [testbed-node-0] => (item={'key': 'prometheus-mysqld-exporter', 'value': {'container_name': 'prometheus_mysqld_exporter', 'group': 'prometheus-mysqld-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-mysqld-exporter:2024.2', 'volumes': ['/etc/kolla/prometheus-mysqld-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-09-27 00:59:11.601731 | orchestrator | changed: [testbed-manager] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-node-exporter:2024.2', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}}) 2025-09-27 00:59:11.601741 | orchestrator | changed: [testbed-node-1] => (item={'key': 'prometheus-mysqld-exporter', 'value': {'container_name': 'prometheus_mysqld_exporter', 'group': 'prometheus-mysqld-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-mysqld-exporter:2024.2', 'volumes': ['/etc/kolla/prometheus-mysqld-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-09-27 00:59:11.601751 | orchestrator | changed: [testbed-node-3] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-cadvisor:2024.2', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}}) 2025-09-27 00:59:11.601762 | orchestrator | changed: [testbed-node-2] => (item={'key': 'prometheus-mysqld-exporter', 'value': {'container_name': 'prometheus_mysqld_exporter', 'group': 
'prometheus-mysqld-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-mysqld-exporter:2024.2', 'volumes': ['/etc/kolla/prometheus-mysqld-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-09-27 00:59:11.601772 | orchestrator | changed: [testbed-node-4] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-cadvisor:2024.2', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}}) 2025-09-27 00:59:11.601787 | orchestrator | changed: [testbed-manager] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-cadvisor:2024.2', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}}) 2025-09-27 00:59:11.601830 | orchestrator | changed: [testbed-node-5] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-cadvisor:2024.2', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}}) 2025-09-27 00:59:11.601841 | orchestrator | changed: [testbed-node-0] => (item={'key': 'prometheus-memcached-exporter', 'value': {'container_name': 'prometheus_memcached_exporter', 'group': 'prometheus-memcached-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-memcached-exporter:2024.2', 'volumes': ['/etc/kolla/prometheus-memcached-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-09-27 00:59:11.601852 | orchestrator | changed: [testbed-node-1] => (item={'key': 'prometheus-memcached-exporter', 'value': {'container_name': 'prometheus_memcached_exporter', 'group': 'prometheus-memcached-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-memcached-exporter:2024.2', 'volumes': ['/etc/kolla/prometheus-memcached-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-09-27 00:59:11.601862 | orchestrator | changed: [testbed-node-3] => (item={'key': 'prometheus-libvirt-exporter', 'value': {'container_name': 'prometheus_libvirt_exporter', 'group': 'prometheus-libvirt-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-libvirt-exporter:2024.2', 'volumes': 
['/etc/kolla/prometheus-libvirt-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/run/libvirt:/run/libvirt:ro'], 'dimensions': {}}}) 2025-09-27 00:59:11.601876 | orchestrator | changed: [testbed-node-2] => (item={'key': 'prometheus-memcached-exporter', 'value': {'container_name': 'prometheus_memcached_exporter', 'group': 'prometheus-memcached-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-memcached-exporter:2024.2', 'volumes': ['/etc/kolla/prometheus-memcached-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-09-27 00:59:11.601887 | orchestrator | changed: [testbed-manager] => (item={'key': 'prometheus-alertmanager', 'value': {'container_name': 'prometheus_alertmanager', 'group': 'prometheus-alertmanager', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-alertmanager:2024.2', 'volumes': ['/etc/kolla/prometheus-alertmanager/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', 'prometheus:/var/lib/prometheus'], 'dimensions': {}, 'haproxy': {'prometheus_alertmanager': {'enabled': True, 'mode': 'http', 'external': False, 'port': '9093', 'auth_user': 'admin', 'auth_pass': 'BXo64rLqmF7bTbWLDOnNJlD0qJ4BSTWocNHVNKU2', 'active_passive': True}, 'prometheus_alertmanager_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '9093', 'listen_port': '9093', 'auth_user': 'admin', 'auth_pass': 'BXo64rLqmF7bTbWLDOnNJlD0qJ4BSTWocNHVNKU2', 'active_passive': True}}}}) 2025-09-27 00:59:11.601904 | orchestrator | changed: [testbed-node-5] => (item={'key': 'prometheus-libvirt-exporter', 'value': {'container_name': 'prometheus_libvirt_exporter', 'group': 'prometheus-libvirt-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-libvirt-exporter:2024.2', 'volumes': ['/etc/kolla/prometheus-libvirt-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/run/libvirt:/run/libvirt:ro'], 'dimensions': {}}}) 2025-09-27 00:59:11.601919 | orchestrator | changed: [testbed-node-4] => (item={'key': 'prometheus-libvirt-exporter', 'value': {'container_name': 'prometheus_libvirt_exporter', 'group': 'prometheus-libvirt-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-libvirt-exporter:2024.2', 'volumes': ['/etc/kolla/prometheus-libvirt-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/run/libvirt:/run/libvirt:ro'], 'dimensions': {}}}) 2025-09-27 00:59:11.601930 | orchestrator | changed: [testbed-node-0] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-cadvisor:2024.2', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}}) 2025-09-27 00:59:11.601940 | orchestrator | changed: [testbed-node-1] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 
'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-cadvisor:2024.2', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}}) 2025-09-27 00:59:11.601950 | orchestrator | changed: [testbed-node-2] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-cadvisor:2024.2', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}}) 2025-09-27 00:59:11.601967 | orchestrator | changed: [testbed-manager] => (item={'key': 'prometheus-blackbox-exporter', 'value': {'container_name': 'prometheus_blackbox_exporter', 'group': 'prometheus-blackbox-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-blackbox-exporter:2024.2', 'volumes': ['/etc/kolla/prometheus-blackbox-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-09-27 00:59:11.601978 | orchestrator | changed: [testbed-node-0] => (item={'key': 'prometheus-elasticsearch-exporter', 'value': {'container_name': 'prometheus_elasticsearch_exporter', 'group': 'prometheus-elasticsearch-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-elasticsearch-exporter:2024.2', 'volumes': ['/etc/kolla/prometheus-elasticsearch-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-09-27 00:59:11.601988 | orchestrator | changed: [testbed-node-1] => (item={'key': 'prometheus-elasticsearch-exporter', 'value': {'container_name': 'prometheus_elasticsearch_exporter', 'group': 'prometheus-elasticsearch-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-elasticsearch-exporter:2024.2', 'volumes': ['/etc/kolla/prometheus-elasticsearch-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-09-27 00:59:11.602007 | orchestrator | changed: [testbed-node-2] => (item={'key': 'prometheus-elasticsearch-exporter', 'value': {'container_name': 'prometheus_elasticsearch_exporter', 'group': 'prometheus-elasticsearch-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-elasticsearch-exporter:2024.2', 'volumes': ['/etc/kolla/prometheus-elasticsearch-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-09-27 00:59:11.602085 | orchestrator | 2025-09-27 00:59:11.602099 | orchestrator | TASK [prometheus : Find custom prometheus alert rules files] ******************* 2025-09-27 00:59:11.602109 | orchestrator | Saturday 27 September 2025 00:58:17 +0000 (0:00:06.157) 0:00:23.685 **** 2025-09-27 00:59:11.602119 | orchestrator | ok: [testbed-manager -> localhost] 
2025-09-27 00:59:11.602129 | orchestrator | 2025-09-27 00:59:11.602139 | orchestrator | TASK [prometheus : Copying over custom prometheus alert rules files] *********** 2025-09-27 00:59:11.602154 | orchestrator | Saturday 27 September 2025 00:58:18 +0000 (0:00:01.042) 0:00:24.728 **** 2025-09-27 00:59:11.602165 | orchestrator | skipping: [testbed-node-1] => (item={'path': '/operations/prometheus/fluentd-aggregator.rules', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 996, 'inode': 1090401, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.5266464, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False})  2025-09-27 00:59:11.602176 | orchestrator | skipping: [testbed-node-0] => (item={'path': '/operations/prometheus/fluentd-aggregator.rules', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 996, 'inode': 1090401, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.5266464, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False})  2025-09-27 00:59:11.602186 | orchestrator | skipping: [testbed-node-2] => (item={'path': '/operations/prometheus/fluentd-aggregator.rules', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 996, 'inode': 1090401, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.5266464, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False})  2025-09-27 00:59:11.602202 | orchestrator | skipping: [testbed-node-1] => (item={'path': '/operations/prometheus/prometheus.rules', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 12980, 'inode': 1090528, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.5523808, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False})  2025-09-27 00:59:11.602213 | orchestrator | skipping: [testbed-node-4] => (item={'path': '/operations/prometheus/fluentd-aggregator.rules', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 996, 'inode': 1090401, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.5266464, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False})  2025-09-27 00:59:11.602230 | orchestrator | skipping: [testbed-node-2] => (item={'path': '/operations/prometheus/prometheus.rules', 'mode': '0644', 'isdir': False, 
'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 12980, 'inode': 1090528, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.5523808, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False})  2025-09-27 00:59:11.602245 | orchestrator | skipping: [testbed-node-1] => (item={'path': '/operations/prometheus/ceph.rules', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 55956, 'inode': 1090395, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.5192118, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False})  2025-09-27 00:59:11.602256 | orchestrator | skipping: [testbed-node-3] => (item={'path': '/operations/prometheus/fluentd-aggregator.rules', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 996, 'inode': 1090401, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.5266464, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False})  2025-09-27 00:59:11.602266 | orchestrator | changed: [testbed-manager] => (item={'path': '/operations/prometheus/fluentd-aggregator.rules', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 996, 'inode': 1090401, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.5266464, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}) 2025-09-27 00:59:11.602276 | orchestrator | skipping: [testbed-node-3] => (item={'path': '/operations/prometheus/prometheus.rules', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 12980, 'inode': 1090528, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.5523808, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False})  2025-09-27 00:59:11.602290 | orchestrator | skipping: [testbed-node-5] => (item={'path': '/operations/prometheus/fluentd-aggregator.rules', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 996, 'inode': 1090401, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.5266464, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False})  2025-09-27 
00:59:11.602306 | orchestrator | skipping: [testbed-node-2] => (item={'path': '/operations/prometheus/ceph.rules', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 55956, 'inode': 1090395, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.5192118, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False})  2025-09-27 00:59:11.602317 | orchestrator | skipping: [testbed-node-0] => (item={'path': '/operations/prometheus/prometheus.rules', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 12980, 'inode': 1090528, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.5523808, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False})  2025-09-27 00:59:11.602353 | orchestrator | skipping: [testbed-node-4] => (item={'path': '/operations/prometheus/prometheus.rules', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 12980, 'inode': 1090528, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.5523808, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False})  2025-09-27 00:59:11.602364 | orchestrator | skipping: [testbed-node-1] => (item={'path': '/operations/prometheus/openstack.rules', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 12293, 'inode': 1090434, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.5306864, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False})  2025-09-27 00:59:11.602374 | orchestrator | skipping: [testbed-node-3] => (item={'path': '/operations/prometheus/ceph.rules', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 55956, 'inode': 1090395, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.5192118, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False})  2025-09-27 00:59:11.602384 | orchestrator | skipping: [testbed-node-2] => (item={'path': '/operations/prometheus/openstack.rules', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 12293, 'inode': 1090434, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.5306864, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 
'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False})  2025-09-27 00:59:11.602398 | orchestrator | skipping: [testbed-node-4] => (item={'path': '/operations/prometheus/ceph.rules', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 55956, 'inode': 1090395, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.5192118, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False})  2025-09-27 00:59:11.602415 | orchestrator | skipping: [testbed-node-0] => (item={'path': '/operations/prometheus/ceph.rules', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 55956, 'inode': 1090395, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.5192118, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False})  2025-09-27 00:59:11.602425 | orchestrator | skipping: [testbed-node-1] => (item={'path': '/operations/prometheus/cadvisor.rules', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 3900, 'inode': 1090392, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.5162113, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False})  2025-09-27 00:59:11.602449 | orchestrator | skipping: [testbed-node-5] => (item={'path': '/operations/prometheus/prometheus.rules', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 12980, 'inode': 1090528, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.5523808, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False})  2025-09-27 00:59:11.602459 | orchestrator | skipping: [testbed-node-3] => (item={'path': '/operations/prometheus/openstack.rules', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 12293, 'inode': 1090434, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.5306864, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False})  2025-09-27 00:59:11.602469 | orchestrator | skipping: [testbed-node-5] => (item={'path': '/operations/prometheus/ceph.rules', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 55956, 'inode': 1090395, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 
1758931333.0, 'ctime': 1758932198.5192118, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False})  2025-09-27 00:59:11.602480 | orchestrator | skipping: [testbed-node-0] => (item={'path': '/operations/prometheus/openstack.rules', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 12293, 'inode': 1090434, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.5306864, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False})  2025-09-27 00:59:11.602494 | orchestrator | skipping: [testbed-node-2] => (item={'path': '/operations/prometheus/cadvisor.rules', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 3900, 'inode': 1090392, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.5162113, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False})  2025-09-27 00:59:11.602511 | orchestrator | skipping: [testbed-node-1] => (item={'path': '/operations/prometheus/haproxy.rules', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 7933, 'inode': 1090420, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.5285294, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False})  2025-09-27 00:59:11.602521 | orchestrator | changed: [testbed-manager] => (item={'path': '/operations/prometheus/prometheus.rules', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 12980, 'inode': 1090528, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.5523808, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}) 2025-09-27 00:59:11.602531 | orchestrator | skipping: [testbed-node-2] => (item={'path': '/operations/prometheus/haproxy.rules', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 7933, 'inode': 1090420, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.5285294, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False})  2025-09-27 00:59:11.602548 | orchestrator | skipping: [testbed-node-4] => (item={'path': '/operations/prometheus/openstack.rules', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': 
False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 12293, 'inode': 1090434, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.5306864, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False})  2025-09-27 00:59:11.602558 | orchestrator | skipping: [testbed-node-5] => (item={'path': '/operations/prometheus/openstack.rules', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 12293, 'inode': 1090434, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.5306864, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False})  2025-09-27 00:59:11.602568 | orchestrator | skipping: [testbed-node-3] => (item={'path': '/operations/prometheus/cadvisor.rules', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 3900, 'inode': 1090392, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.5162113, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False})  2025-09-27 00:59:11.602588 | orchestrator | skipping: [testbed-node-0] => (item={'path': '/operations/prometheus/cadvisor.rules', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 3900, 'inode': 1090392, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.5162113, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False})  2025-09-27 00:59:11.602599 | orchestrator | skipping: [testbed-node-1] => (item={'path': '/operations/prometheus/node.rules', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 13522, 'inode': 1090431, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.5306864, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False})  2025-09-27 00:59:11.602609 | orchestrator | skipping: [testbed-node-0] => (item={'path': '/operations/prometheus/haproxy.rules', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 7933, 'inode': 1090420, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.5285294, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False})  2025-09-27 00:59:11.602619 | orchestrator | skipping: [testbed-node-5] => (item={'path': 
'/operations/prometheus/cadvisor.rules', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 3900, 'inode': 1090392, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.5162113, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False})  2025-09-27 00:59:11.602635 | orchestrator | skipping: [testbed-node-4] => (item={'path': '/operations/prometheus/cadvisor.rules', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 3900, 'inode': 1090392, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.5162113, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False})  2025-09-27 00:59:11.602645 | orchestrator | skipping: [testbed-node-3] => (item={'path': '/operations/prometheus/haproxy.rules', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 7933, 'inode': 1090420, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.5285294, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False})  2025-09-27 00:59:11.602656 | orchestrator | skipping: [testbed-node-2] => (item={'path': '/operations/prometheus/node.rules', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 13522, 'inode': 1090431, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.5306864, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False})  2025-09-27 00:59:11.602676 | orchestrator | changed: [testbed-manager] => (item={'path': '/operations/prometheus/ceph.rules', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 55956, 'inode': 1090395, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.5192118, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}) 2025-09-27 00:59:11.602687 | orchestrator | skipping: [testbed-node-3] => (item={'path': '/operations/prometheus/node.rules', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 13522, 'inode': 1090431, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.5306864, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': 
False, 'isgid': False})  2025-09-27 00:59:11.602697 | orchestrator | skipping: [testbed-node-0] => (item={'path': '/operations/prometheus/node.rules', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 13522, 'inode': 1090431, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.5306864, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False})  2025-09-27 00:59:11.602707 | orchestrator | skipping: [testbed-node-4] => (item={'path': '/operations/prometheus/haproxy.rules', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 7933, 'inode': 1090420, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.5285294, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False})  2025-09-27 00:59:11.602722 | orchestrator | skipping: [testbed-node-5] => (item={'path': '/operations/prometheus/haproxy.rules', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 7933, 'inode': 1090420, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.5285294, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False})  2025-09-27 00:59:11.602733 | orchestrator | skipping: [testbed-node-1] => (item={'path': '/operations/prometheus/hardware.rules', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 5593, 'inode': 1090424, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.5288224, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False})  2025-09-27 00:59:11.602743 | orchestrator | skipping: [testbed-node-3] => (item={'path': '/operations/prometheus/hardware.rules', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 5593, 'inode': 1090424, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.5288224, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False})  2025-09-27 00:59:11.602766 | orchestrator | skipping: [testbed-node-2] => (item={'path': '/operations/prometheus/hardware.rules', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 5593, 'inode': 1090424, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.5288224, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': 
True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False})  2025-09-27 00:59:11.602776 | orchestrator | skipping: [testbed-node-4] => (item={'path': '/operations/prometheus/node.rules', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 13522, 'inode': 1090431, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.5306864, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False})  2025-09-27 00:59:11.602787 | orchestrator | skipping: [testbed-node-3] => (item={'path': '/operations/prometheus/elasticsearch.rules', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 5987, 'inode': 1090399, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.5197542, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False})  2025-09-27 00:59:11.602796 | orchestrator | skipping: [testbed-node-0] => (item={'path': '/operations/prometheus/hardware.rules', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 5593, 'inode': 1090424, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.5288224, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False})  2025-09-27 00:59:11.602812 | orchestrator | skipping: [testbed-node-5] => (item={'path': '/operations/prometheus/node.rules', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 13522, 'inode': 1090431, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.5306864, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False})  2025-09-27 00:59:11.602822 | orchestrator | skipping: [testbed-node-1] => (item={'path': '/operations/prometheus/elasticsearch.rules', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 5987, 'inode': 1090399, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.5197542, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False})  2025-09-27 00:59:11.602832 | orchestrator | skipping: [testbed-node-3] => (item={'path': '/operations/prometheus/prometheus.rec.rules', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 3, 'inode': 1090523, 'dev': 112, 'nlink': 1, 
'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.5512428, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False})  2025-09-27 00:59:11.602852 | orchestrator | skipping: [testbed-node-0] => (item={'path': '/operations/prometheus/elasticsearch.rules', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 5987, 'inode': 1090399, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.5197542, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False})  2025-09-27 00:59:11.602863 | orchestrator | skipping: [testbed-node-2] => (item={'path': '/operations/prometheus/elasticsearch.rules', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 5987, 'inode': 1090399, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.5197542, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False})  2025-09-27 00:59:11.602899 | orchestrator | skipping: [testbed-node-3] => (item={'path': '/operations/prometheus/alertmanager.rec.rules', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 3, 'inode': 1090386, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.5154493, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False})  2025-09-27 00:59:11.602910 | orchestrator | skipping: [testbed-node-4] => (item={'path': '/operations/prometheus/hardware.rules', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 5593, 'inode': 1090424, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.5288224, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False})  2025-09-27 00:59:11.602926 | orchestrator | skipping: [testbed-node-1] => (item={'path': '/operations/prometheus/prometheus.rec.rules', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 3, 'inode': 1090523, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.5512428, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False})  2025-09-27 00:59:11.602936 | orchestrator | skipping: [testbed-node-5] => (item={'path': '/operations/prometheus/hardware.rules', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 
'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 5593, 'inode': 1090424, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.5288224, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False})  2025-09-27 00:59:11.602952 | orchestrator | skipping: [testbed-node-3] => (item={'path': '/operations/prometheus/redfish.rules', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 334, 'inode': 1090546, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.5548663, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False})  2025-09-27 00:59:11.602967 | orchestrator | skipping: [testbed-node-0] => (item={'path': '/operations/prometheus/prometheus.rec.rules', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 3, 'inode': 1090523, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.5512428, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False})  2025-09-27 00:59:11.602977 | orchestrator | skipping: [testbed-node-5] => (item={'path': '/operations/prometheus/elasticsearch.rules', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 5987, 'inode': 1090399, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.5197542, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False})  2025-09-27 00:59:11.602987 | orchestrator | skipping: [testbed-node-0] => (item={'path': '/operations/prometheus/alertmanager.rec.rules', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 3, 'inode': 1090386, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.5154493, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False})  2025-09-27 00:59:11.602997 | orchestrator | skipping: [testbed-node-3] => (item={'path': '/operations/prometheus/prometheus-extra.rules', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 7408, 'inode': 1090438, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.5506456, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False})  2025-09-27 00:59:11.603013 | orchestrator | 
skipping: [testbed-node-5] => (item={'path': '/operations/prometheus/prometheus.rec.rules', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 3, 'inode': 1090523, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.5512428, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False})  2025-09-27 00:59:11.603024 | orchestrator | changed: [testbed-manager] => (item={'path': '/operations/prometheus/openstack.rules', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 12293, 'inode': 1090434, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.5306864, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}) 2025-09-27 00:59:11.603056 | orchestrator | skipping: [testbed-node-2] => (item={'path': '/operations/prometheus/prometheus.rec.rules', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 3, 'inode': 1090523, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.5512428, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False})  2025-09-27 00:59:11.603071 | orchestrator | skipping: [testbed-node-3] => (item={'path': '/operations/prometheus/ceph.rec.rules', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 3, 'inode': 1090393, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.5178647, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False})  2025-09-27 00:59:11.603082 | orchestrator | skipping: [testbed-node-0] => (item={'path': '/operations/prometheus/redfish.rules', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 334, 'inode': 1090546, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.5548663, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False})  2025-09-27 00:59:11.603092 | orchestrator | skipping: [testbed-node-4] => (item={'path': '/operations/prometheus/elasticsearch.rules', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 5987, 'inode': 1090399, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.5197542, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 
'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False})  2025-09-27 00:59:11.603102 | orchestrator | skipping: [testbed-node-2] => (item={'path': '/operations/prometheus/alertmanager.rec.rules', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 3, 'inode': 1090386, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.5154493, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False})  2025-09-27 00:59:11.603118 | orchestrator | skipping: [testbed-node-0] => (item={'path': '/operations/prometheus/prometheus-extra.rules', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 7408, 'inode': 1090438, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.5506456, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False})  2025-09-27 00:59:11.603128 | orchestrator | skipping: [testbed-node-1] => (item={'path': '/operations/prometheus/alertmanager.rec.rules', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 3, 'inode': 1090386, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.5154493, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False})  2025-09-27 00:59:11.603144 | orchestrator | skipping: [testbed-node-5] => (item={'path': '/operations/prometheus/alertmanager.rec.rules', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 3, 'inode': 1090386, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.5154493, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False})  2025-09-27 00:59:11.603158 | orchestrator | skipping: [testbed-node-0] => (item={'path': '/operations/prometheus/ceph.rec.rules', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 3, 'inode': 1090393, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.5178647, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False})  2025-09-27 00:59:11.603168 | orchestrator | skipping: [testbed-node-2] => (item={'path': '/operations/prometheus/redfish.rules', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 334, 'inode': 1090546, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 
1758931333.0, 'ctime': 1758932198.5548663, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False})  2025-09-27 00:59:11.603179 | orchestrator | skipping: [testbed-node-3] => (item={'path': '/operations/prometheus/alertmanager.rules', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 5051, 'inode': 1090390, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.5158193, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False})  2025-09-27 00:59:11.603189 | orchestrator | skipping: [testbed-node-0] => (item={'path': '/operations/prometheus/alertmanager.rules', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 5051, 'inode': 1090390, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.5158193, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False})  2025-09-27 00:59:11.603205 | orchestrator | skipping: [testbed-node-2] => (item={'path': '/operations/prometheus/prometheus-extra.rules', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 7408, 'inode': 1090438, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.5506456, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False})  2025-09-27 00:59:11.603221 | orchestrator | skipping: [testbed-node-4] => (item={'path': '/operations/prometheus/prometheus.rec.rules', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 3, 'inode': 1090523, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.5512428, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False})  2025-09-27 00:59:11.603231 | orchestrator | skipping: [testbed-node-5] => (item={'path': '/operations/prometheus/redfish.rules', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 334, 'inode': 1090546, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.5548663, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False})  2025-09-27 00:59:11.603245 | orchestrator | skipping: [testbed-node-0] => (item={'path': '/operations/prometheus/node.rec.rules', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 
'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 2309, 'inode': 1090428, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.5293958, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False})  2025-09-27 00:59:11.603256 | orchestrator | skipping: [testbed-node-1] => (item={'path': '/operations/prometheus/redfish.rules', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 334, 'inode': 1090546, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.5548663, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False})  2025-09-27 00:59:11.603266 | orchestrator | skipping: [testbed-node-0] => (item={'path': '/operations/prometheus/mysql.rules', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 3792, 'inode': 1090426, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.5288224, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False})  2025-09-27 00:59:11.603276 | orchestrator | changed: [testbed-manager] => (item={'path': '/operations/prometheus/cadvisor.rules', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 3900, 'inode': 1090392, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.5162113, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}) 2025-09-27 00:59:11.603292 | orchestrator | skipping: [testbed-node-5] => (item={'path': '/operations/prometheus/prometheus-extra.rules', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 7408, 'inode': 1090438, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.5506456, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False})  2025-09-27 00:59:11.603307 | orchestrator | skipping: [testbed-node-3] => (item={'path': '/operations/prometheus/node.rec.rules', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 2309, 'inode': 1090428, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.5293958, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False})  2025-09-27 00:59:11.603317 | orchestrator | skipping: [testbed-node-4] => (item={'path': 
'/operations/prometheus/alertmanager.rec.rules', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 3, 'inode': 1090386, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.5154493, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False})  2025-09-27 00:59:11.603327 | orchestrator | skipping: [testbed-node-2] => (item={'path': '/operations/prometheus/ceph.rec.rules', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 3, 'inode': 1090393, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.5178647, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False})  2025-09-27 00:59:11.603342 | orchestrator | skipping: [testbed-node-0] => (item={'path': '/operations/prometheus/rabbitmq.rules', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 3539, 'inode': 1090541, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.5539503, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False})  2025-09-27 00:59:11.603352 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:59:11.603362 | orchestrator | skipping: [testbed-node-1] => (item={'path': '/operations/prometheus/prometheus-extra.rules', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 7408, 'inode': 1090438, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.5506456, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False})  2025-09-27 00:59:11.603372 | orchestrator | skipping: [testbed-node-3] => (item={'path': '/operations/prometheus/mysql.rules', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 3792, 'inode': 1090426, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.5288224, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False})  2025-09-27 00:59:11.603388 | orchestrator | skipping: [testbed-node-2] => (item={'path': '/operations/prometheus/alertmanager.rules', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 5051, 'inode': 1090390, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.5158193, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 
'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False})  2025-09-27 00:59:11.603404 | orchestrator | skipping: [testbed-node-1] => (item={'path': '/operations/prometheus/ceph.rec.rules', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 3, 'inode': 1090393, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.5178647, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False})  2025-09-27 00:59:11.603414 | orchestrator | skipping: [testbed-node-3] => (item={'path': '/operations/prometheus/rabbitmq.rules', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 3539, 'inode': 1090541, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.5539503, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False})  2025-09-27 00:59:11.603423 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:59:11.603438 | orchestrator | skipping: [testbed-node-5] => (item={'path': '/operations/prometheus/ceph.rec.rules', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 3, 'inode': 1090393, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.5178647, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False})  2025-09-27 00:59:11.603448 | orchestrator | skipping: [testbed-node-4] => (item={'path': '/operations/prometheus/redfish.rules', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 334, 'inode': 1090546, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.5548663, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False})  2025-09-27 00:59:11.603458 | orchestrator | skipping: [testbed-node-2] => (item={'path': '/operations/prometheus/node.rec.rules', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 2309, 'inode': 1090428, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.5293958, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False})  2025-09-27 00:59:11.603468 | orchestrator | skipping: [testbed-node-1] => (item={'path': '/operations/prometheus/alertmanager.rules', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 5051, 
'inode': 1090390, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.5158193, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False})  2025-09-27 00:59:11.603494 | orchestrator | skipping: [testbed-node-4] => (item={'path': '/operations/prometheus/prometheus-extra.rules', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 7408, 'inode': 1090438, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.5506456, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False})  2025-09-27 00:59:11.603505 | orchestrator | skipping: [testbed-node-5] => (item={'path': '/operations/prometheus/alertmanager.rules', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 5051, 'inode': 1090390, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.5158193, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False})  2025-09-27 00:59:11.603515 | orchestrator | skipping: [testbed-node-2] => (item={'path': '/operations/prometheus/mysql.rules', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 3792, 'inode': 1090426, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.5288224, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False})  2025-09-27 00:59:11.603525 | orchestrator | skipping: [testbed-node-1] => (item={'path': '/operations/prometheus/node.rec.rules', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 2309, 'inode': 1090428, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.5293958, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False})  2025-09-27 00:59:11.603539 | orchestrator | skipping: [testbed-node-4] => (item={'path': '/operations/prometheus/ceph.rec.rules', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 3, 'inode': 1090393, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.5178647, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False})  2025-09-27 00:59:11.603549 | orchestrator | skipping: [testbed-node-2] => (item={'path': '/operations/prometheus/rabbitmq.rules', 'mode': '0644', 'isdir': False, 
'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 3539, 'inode': 1090541, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.5539503, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False})  2025-09-27 00:59:11.603559 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:59:11.603569 | orchestrator | changed: [testbed-manager] => (item={'path': '/operations/prometheus/haproxy.rules', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 7933, 'inode': 1090420, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.5285294, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}) 2025-09-27 00:59:11.603590 | orchestrator | skipping: [testbed-node-1] => (item={'path': '/operations/prometheus/mysql.rules', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 3792, 'inode': 1090426, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.5288224, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False})  2025-09-27 00:59:11.603601 | orchestrator | skipping: [testbed-node-5] => (item={'path': '/operations/prometheus/node.rec.rules', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 2309, 'inode': 1090428, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.5293958, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False})  2025-09-27 00:59:11.603610 | orchestrator | skipping: [testbed-node-4] => (item={'path': '/operations/prometheus/alertmanager.rules', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 5051, 'inode': 1090390, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.5158193, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False})  2025-09-27 00:59:11.603621 | orchestrator | skipping: [testbed-node-1] => (item={'path': '/operations/prometheus/rabbitmq.rules', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 3539, 'inode': 1090541, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.5539503, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 
'isuid': False, 'isgid': False})  2025-09-27 00:59:11.603635 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:59:11.603645 | orchestrator | skipping: [testbed-node-5] => (item={'path': '/operations/prometheus/mysql.rules', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 3792, 'inode': 1090426, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.5288224, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False})  2025-09-27 00:59:11.603655 | orchestrator | skipping: [testbed-node-4] => (item={'path': '/operations/prometheus/node.rec.rules', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 2309, 'inode': 1090428, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.5293958, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False})  2025-09-27 00:59:11.603665 | orchestrator | skipping: [testbed-node-5] => (item={'path': '/operations/prometheus/rabbitmq.rules', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 3539, 'inode': 1090541, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.5539503, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False})  2025-09-27 00:59:11.603682 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:59:11.603697 | orchestrator | skipping: [testbed-node-4] => (item={'path': '/operations/prometheus/mysql.rules', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 3792, 'inode': 1090426, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.5288224, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False})  2025-09-27 00:59:11.603707 | orchestrator | changed: [testbed-manager] => (item={'path': '/operations/prometheus/node.rules', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 13522, 'inode': 1090431, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.5306864, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}) 2025-09-27 00:59:11.603718 | orchestrator | skipping: [testbed-node-4] => (item={'path': '/operations/prometheus/rabbitmq.rules', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 3539, 'inode': 1090541, 'dev': 112, 
'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.5539503, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False})  2025-09-27 00:59:11.603728 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:59:11.603742 | orchestrator | changed: [testbed-manager] => (item={'path': '/operations/prometheus/hardware.rules', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 5593, 'inode': 1090424, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.5288224, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}) 2025-09-27 00:59:11.603752 | orchestrator | changed: [testbed-manager] => (item={'path': '/operations/prometheus/elasticsearch.rules', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 5987, 'inode': 1090399, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.5197542, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}) 2025-09-27 00:59:11.603762 | orchestrator | changed: [testbed-manager] => (item={'path': '/operations/prometheus/prometheus.rec.rules', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 3, 'inode': 1090523, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.5512428, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}) 2025-09-27 00:59:11.603778 | orchestrator | changed: [testbed-manager] => (item={'path': '/operations/prometheus/alertmanager.rec.rules', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 3, 'inode': 1090386, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.5154493, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}) 2025-09-27 00:59:11.603793 | orchestrator | changed: [testbed-manager] => (item={'path': '/operations/prometheus/redfish.rules', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 334, 'inode': 1090546, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.5548663, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}) 2025-09-27 00:59:11.603804 | orchestrator | changed: [testbed-manager] => (item={'path': 
'/operations/prometheus/prometheus-extra.rules', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 7408, 'inode': 1090438, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.5506456, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}) 2025-09-27 00:59:11.603814 | orchestrator | changed: [testbed-manager] => (item={'path': '/operations/prometheus/ceph.rec.rules', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 3, 'inode': 1090393, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.5178647, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}) 2025-09-27 00:59:11.603828 | orchestrator | changed: [testbed-manager] => (item={'path': '/operations/prometheus/alertmanager.rules', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 5051, 'inode': 1090390, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.5158193, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}) 2025-09-27 00:59:11.603839 | orchestrator | changed: [testbed-manager] => (item={'path': '/operations/prometheus/node.rec.rules', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 2309, 'inode': 1090428, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.5293958, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}) 2025-09-27 00:59:11.603849 | orchestrator | changed: [testbed-manager] => (item={'path': '/operations/prometheus/mysql.rules', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 3792, 'inode': 1090426, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.5288224, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}) 2025-09-27 00:59:11.603864 | orchestrator | changed: [testbed-manager] => (item={'path': '/operations/prometheus/rabbitmq.rules', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 3539, 'inode': 1090541, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.5539503, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': 
False, 'isuid': False, 'isgid': False}) 2025-09-27 00:59:11.603874 | orchestrator | 2025-09-27 00:59:11.603884 | orchestrator | TASK [prometheus : Find prometheus common config overrides] ******************** 2025-09-27 00:59:11.603894 | orchestrator | Saturday 27 September 2025 00:58:41 +0000 (0:00:23.120) 0:00:47.849 **** 2025-09-27 00:59:11.603909 | orchestrator | ok: [testbed-manager -> localhost] 2025-09-27 00:59:11.603919 | orchestrator | 2025-09-27 00:59:11.603928 | orchestrator | TASK [prometheus : Find prometheus host config overrides] ********************** 2025-09-27 00:59:11.603938 | orchestrator | Saturday 27 September 2025 00:58:42 +0000 (0:00:00.712) 0:00:48.561 **** 2025-09-27 00:59:11.603947 | orchestrator | [WARNING]: Skipped 2025-09-27 00:59:11.603958 | orchestrator | '/opt/configuration/environments/kolla/files/overlays/prometheus/testbed- 2025-09-27 00:59:11.603968 | orchestrator | manager/prometheus.yml.d' path due to this access issue: 2025-09-27 00:59:11.603978 | orchestrator | '/opt/configuration/environments/kolla/files/overlays/prometheus/testbed- 2025-09-27 00:59:11.603987 | orchestrator | manager/prometheus.yml.d' is not a directory 2025-09-27 00:59:11.603997 | orchestrator | ok: [testbed-manager -> localhost] 2025-09-27 00:59:11.604006 | orchestrator | [WARNING]: Skipped 2025-09-27 00:59:11.604016 | orchestrator | '/opt/configuration/environments/kolla/files/overlays/prometheus/testbed- 2025-09-27 00:59:11.604026 | orchestrator | node-0/prometheus.yml.d' path due to this access issue: 2025-09-27 00:59:11.604049 | orchestrator | '/opt/configuration/environments/kolla/files/overlays/prometheus/testbed- 2025-09-27 00:59:11.604059 | orchestrator | node-0/prometheus.yml.d' is not a directory 2025-09-27 00:59:11.604069 | orchestrator | [WARNING]: Skipped 2025-09-27 00:59:11.604079 | orchestrator | '/opt/configuration/environments/kolla/files/overlays/prometheus/testbed- 2025-09-27 00:59:11.604088 | orchestrator | node-1/prometheus.yml.d' path due to this access issue: 2025-09-27 00:59:11.604098 | orchestrator | '/opt/configuration/environments/kolla/files/overlays/prometheus/testbed- 2025-09-27 00:59:11.604107 | orchestrator | node-1/prometheus.yml.d' is not a directory 2025-09-27 00:59:11.604117 | orchestrator | [WARNING]: Skipped 2025-09-27 00:59:11.604126 | orchestrator | '/opt/configuration/environments/kolla/files/overlays/prometheus/testbed- 2025-09-27 00:59:11.604136 | orchestrator | node-2/prometheus.yml.d' path due to this access issue: 2025-09-27 00:59:11.604145 | orchestrator | '/opt/configuration/environments/kolla/files/overlays/prometheus/testbed- 2025-09-27 00:59:11.604155 | orchestrator | node-2/prometheus.yml.d' is not a directory 2025-09-27 00:59:11.604164 | orchestrator | [WARNING]: Skipped 2025-09-27 00:59:11.604174 | orchestrator | '/opt/configuration/environments/kolla/files/overlays/prometheus/testbed- 2025-09-27 00:59:11.604183 | orchestrator | node-3/prometheus.yml.d' path due to this access issue: 2025-09-27 00:59:11.604193 | orchestrator | '/opt/configuration/environments/kolla/files/overlays/prometheus/testbed- 2025-09-27 00:59:11.604203 | orchestrator | node-3/prometheus.yml.d' is not a directory 2025-09-27 00:59:11.604212 | orchestrator | [WARNING]: Skipped 2025-09-27 00:59:11.604228 | orchestrator | '/opt/configuration/environments/kolla/files/overlays/prometheus/testbed- 2025-09-27 00:59:11.604238 | orchestrator | node-4/prometheus.yml.d' path due to this access issue: 2025-09-27 00:59:11.604247 | orchestrator | 
'/opt/configuration/environments/kolla/files/overlays/prometheus/testbed- 2025-09-27 00:59:11.604261 | orchestrator | node-4/prometheus.yml.d' is not a directory 2025-09-27 00:59:11.604271 | orchestrator | [WARNING]: Skipped 2025-09-27 00:59:11.604280 | orchestrator | '/opt/configuration/environments/kolla/files/overlays/prometheus/testbed- 2025-09-27 00:59:11.604306 | orchestrator | node-5/prometheus.yml.d' path due to this access issue: 2025-09-27 00:59:11.604316 | orchestrator | '/opt/configuration/environments/kolla/files/overlays/prometheus/testbed- 2025-09-27 00:59:11.604325 | orchestrator | node-5/prometheus.yml.d' is not a directory 2025-09-27 00:59:11.604335 | orchestrator | ok: [testbed-node-0 -> localhost] 2025-09-27 00:59:11.604344 | orchestrator | ok: [testbed-node-1 -> localhost] 2025-09-27 00:59:11.604353 | orchestrator | ok: [testbed-node-2 -> localhost] 2025-09-27 00:59:11.604363 | orchestrator | ok: [testbed-node-3 -> localhost] 2025-09-27 00:59:11.604372 | orchestrator | ok: [testbed-node-4 -> localhost] 2025-09-27 00:59:11.604382 | orchestrator | ok: [testbed-node-5 -> localhost] 2025-09-27 00:59:11.604391 | orchestrator | 2025-09-27 00:59:11.604401 | orchestrator | TASK [prometheus : Copying over prometheus config file] ************************ 2025-09-27 00:59:11.604410 | orchestrator | Saturday 27 September 2025 00:58:44 +0000 (0:00:01.602) 0:00:50.164 **** 2025-09-27 00:59:11.604419 | orchestrator | skipping: [testbed-node-0] => (item=/ansible/roles/prometheus/templates/prometheus.yml.j2)  2025-09-27 00:59:11.604429 | orchestrator | skipping: [testbed-node-1] => (item=/ansible/roles/prometheus/templates/prometheus.yml.j2)  2025-09-27 00:59:11.604439 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:59:11.604448 | orchestrator | skipping: [testbed-node-2] => (item=/ansible/roles/prometheus/templates/prometheus.yml.j2)  2025-09-27 00:59:11.604458 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:59:11.604468 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:59:11.604477 | orchestrator | skipping: [testbed-node-3] => (item=/ansible/roles/prometheus/templates/prometheus.yml.j2)  2025-09-27 00:59:11.604487 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:59:11.604496 | orchestrator | skipping: [testbed-node-4] => (item=/ansible/roles/prometheus/templates/prometheus.yml.j2)  2025-09-27 00:59:11.604506 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:59:11.604515 | orchestrator | skipping: [testbed-node-5] => (item=/ansible/roles/prometheus/templates/prometheus.yml.j2)  2025-09-27 00:59:11.604525 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:59:11.604667 | orchestrator | fatal: [testbed-manager]: FAILED! 
=> {"msg": "{{ prometheus_blackbox_exporter_endpoints_default | selectattr('enabled', 'true') | map(attribute='endpoints') | flatten | union(prometheus_blackbox_exporter_endpoints_custom) | unique | select | list }}: [{'endpoints': ['aodh:os_endpoint:{{ aodh_public_endpoint }}', \"{{ ('aodh_internal:os_endpoint:' + aodh_internal_endpoint) if not kolla_same_external_internal_vip | bool }}\"], 'enabled': '{{ enable_aodh | bool }}'}, {'endpoints': ['barbican:os_endpoint:{{ barbican_public_endpoint }}', \"{{ ('barbican_internal:os_endpoint:' + barbican_internal_endpoint) if not kolla_same_external_internal_vip | bool }}\"], 'enabled': '{{ enable_barbican | bool }}'}, {'endpoints': ['blazar:os_endpoint:{{ blazar_public_base_endpoint }}', \"{{ ('blazar_internal:os_endpoint:' + blazar_internal_base_endpoint) if not kolla_same_external_internal_vip | bool }}\"], 'enabled': '{{ enable_blazar | bool }}'}, {'endpoints': ['ceph_rgw:http_2xx:{{ ceph_rgw_public_base_endpoint }}', \"{{ ('ceph_rgw_internal:http_2xx:' + ceph_rgw_internal_base_endpoint) if not kolla_same_external_internal_vip | bool }}\"], 'enabled': '{{ enable_ceph_rgw | bool }}'}, {'endpoints': ['cinder:os_endpoint:{{ cinder_public_base_endpoint }}', \"{{ ('cinder_internal:os_endpoint:' + cinder_internal_base_endpoint) if not kolla_same_external_internal_vip | bool }}\"], 'enabled': '{{ enable_cinder | bool }}'}, {'endpoints': ['cloudkitty:os_endpoint:{{ cloudkitty_public_endpoint }}', \"{{ ('cloudkitty_internal:os_endpoint:' + cloudkitty_internal_endpoint) if not kolla_same_external_internal_vip | bool }}\"], 'enabled': '{{ enable_cloudkitty | bool }}'}, {'endpoints': ['designate:os_endpoint:{{ designate_public_endpoint }}', \"{{ ('designate_internal:os_endpoint:' + designate_internal_endpoint) if not kolla_same_external_internal_vip | bool }}\"], 'enabled': '{{ enable_designate | bool }}'}, {'endpoints': ['glance:os_endpoint:{{ glance_public_endpoint }}', \"{{ ('glance_internal:os_endpoint:' + glance_internal_endpoint) if not kolla_same_external_internal_vip | bool }}\"], 'enabled': '{{ enable_glance | bool }}'}, {'endpoints': ['gnocchi:os_endpoint:{{ gnocchi_public_endpoint }}', \"{{ ('gnocchi_internal:os_endpoint:' + gnocchi_internal_endpoint) if not kolla_same_external_internal_vip | bool }}\"], 'enabled': '{{ enable_gnocchi | bool }}'}, {'endpoints': ['heat:os_endpoint:{{ heat_public_base_endpoint }}', \"{{ ('heat_internal:os_endpoint:' + heat_internal_base_endpoint) if not kolla_same_external_internal_vip | bool }}\", 'heat_cfn:os_endpoint:{{ heat_cfn_public_base_endpoint }}', \"{{ ('heat_cfn_internal:os_endpoint:' + heat_cfn_internal_base_endpoint) if not kolla_same_external_internal_vip | bool }}\"], 'enabled': '{{ enable_heat | bool }}'}, {'endpoints': ['horizon:http_2xx:{{ horizon_public_endpoint }}', \"{{ ('horizon_internal:http_2xx:' + horizon_internal_endpoint) if not kolla_same_external_internal_vip | bool }}\"], 'enabled': '{{ enable_horizon | bool }}'}, {'endpoints': ['ironic:os_endpoint:{{ ironic_public_endpoint }}', \"{{ ('ironic_internal:os_endpoint:' + ironic_internal_endpoint) if not kolla_same_external_internal_vip | bool }}\", 'ironic_inspector:os_endpoint:{{ ironic_inspector_public_endpoint }}', \"{{ ('ironic_inspector_internal:os_endpoint:' + ironic_inspector_internal_endpoint) if not kolla_same_external_internal_vip | bool }}\"], 'enabled': '{{ enable_ironic | bool }}'}, {'endpoints': ['keystone:os_endpoint:{{ keystone_public_url }}', \"{{ ('keystone_internal:os_endpoint:' + keystone_internal_url) if not 
kolla_same_external_internal_vip | bool }}\"], 'enabled': '{{ enable_keystone | bool }}'}, {'endpoints': ['magnum:os_endpoint:{{ magnum_public_base_endpoint }}', \"{{ ('magnum_internal:os_endpoint:' + magnum_internal_base_endpoint) if not kolla_same_external_internal_vip | bool }}\"], 'enabled': '{{ enable_magnum | bool }}'}, {'endpoints': ['manila:os_endpoint:{{ manila_public_base_endpoint }}', \"{{ ('manila_internal:os_endpoint:' + manila_internal_base_endpoint) if not kolla_same_external_internal_vip | bool }}\"], 'enabled': '{{ enable_manila | bool }}'}, {'endpoints': ['masakari:os_endpoint:{{ masakari_public_endpoint }}', \"{{ ('masakari_internal:os_endpoint:' + masakari_internal_endpoint) if not kolla_same_external_internal_vip | bool }}\"], 'enabled': '{{ enable_masakari | bool }}'}, {'endpoints': ['mistral:os_endpoint:{{ mistral_public_base_endpoint }}', \"{{ ('mistral_internal:os_endpoint:' + mistral_internal_base_endpoint) if not kolla_same_external_internal_vip | bool }}\"], 'enabled': '{{ enable_mistral | bool }}'}, {'endpoints': ['neutron:os_endpoint:{{ neutron_public_endpoint }}', \"{{ ('neutron_internal:os_endpoint:' + neutron_internal_endpoint) if not kolla_same_external_internal_vip | bool }}\"], 'enabled': '{{ enable_neutron | bool }}'}, {'endpoints': ['nova:os_endpoint:{{ nova_public_base_endpoint }}', \"{{ ('nova_internal:os_endpoint:' + nova_internal_base_endpoint) if not kolla_same_external_internal_vip | bool }}\"], 'enabled': '{{ enable_nova | bool }}'}, {'endpoints': ['octavia:os_endpoint:{{ octavia_public_endpoint }}', \"{{ ('octavia_internal:os_endpoint:' + octavia_internal_endpoint) if not kolla_same_external_internal_vip | bool }}\"], 'enabled': '{{ enable_octavia | bool }}'}, {'endpoints': ['placement:os_endpoint:{{ placement_public_endpoint }}', \"{{ ('placement_internal:os_endpoint:' + placement_internal_endpoint) if not kolla_same_external_internal_vip | bool }}\"], 'enabled': '{{ enable_placement | bool }}'}, {'endpoints': ['skyline_apiserver:os_endpoint:{{ skyline_apiserver_public_endpoint }}', \"{{ ('skyline_apiserver_internal:os_endpoint:' + skyline_apiserver_internal_endpoint) if not kolla_same_external_internal_vip | bool }}\", 'skyline_console:os_endpoint:{{ skyline_console_public_endpoint }}', \"{{ ('skyline_console_internal:os_endpoint:' + skyline_console_internal_endpoint) if not kolla_same_external_internal_vip | bool }}\"], 'enabled': '{{ enable_skyline | bool }}'}, {'endpoints': ['swift:os_endpoint:{{ swift_public_base_endpoint }}', \"{{ ('swift_internal:os_endpoint:' + swift_internal_base_endpoint) if not kolla_same_external_internal_vip | bool }}\"], 'enabled': '{{ enable_swift | bool }}'}, {'endpoints': ['tacker:os_endpoint:{{ tacker_public_endpoint }}', \"{{ ('tacker_internal:os_endpoint:' + tacker_internal_endpoint) if not kolla_same_external_internal_vip | bool }}\"], 'enabled': '{{ enable_tacker | bool }}'}, {'endpoints': ['trove:os_endpoint:{{ trove_public_base_endpoint }}', \"{{ ('trove_internal:os_endpoint:' + trove_internal_base_endpoint) if not kolla_same_external_internal_vip | bool }}\"], 'enabled': '{{ enable_trove | bool }}'}, {'endpoints': ['venus:os_endpoint:{{ venus_public_endpoint }}', \"{{ ('venus_internal:os_endpoint:' + venus_internal_endpoint) if not kolla_same_external_internal_vip | bool }}\"], 'enabled': '{{ enable_venus | bool }}'}, {'endpoints': ['watcher:os_endpoint:{{ watcher_public_endpoint }}', \"{{ ('watcher_internal:os_endpoint:' + watcher_internal_endpoint) if not kolla_same_external_internal_vip | bool 
}}\"], 'enabled': '{{ enable_watcher | bool }}'}, {'endpoints': ['zun:os_endpoint:{{ zun_public_base_endpoint }}', \"{{ ('zun_internal:os_endpoint:' + zun_internal_base_endpoint) if not kolla_same_external_internal_vip | bool }}\"], 'enabled': '{{ enable_zun | bool }}'}, {'endpoints': \"{% set etcd_endpoints = [] %}{% for host in groups.get('etcd', []) %}{{ etcd_endpoints.append('etcd_' + host + ':http_2xx:' + hostvars[host]['etcd_protocol'] + '://' + ('api' | kolla_address(host) | put_address_in_context('url')) + ':' + hostvars[host]['etcd_client_port'] + '/metrics')}}{% endfor %}{{ etcd_endpoints }}\", 'enabled': '{{ enable_etcd | bool }}'}, {'endpoints': ['grafana:http_2xx:{{ grafana_public_endpoint }}', \"{{ ('grafana_internal:http_2xx:' + grafana_internal_endpoint) if not kolla_same_external_internal_vip | bool }}\"], 'enabled': '{{ enable_grafana | bool }}'}, {'endpoints': ['opensearch:http_2xx:{{ opensearch_internal_endpoint }}'], 'enabled': '{{ enable_opensearch | bool }}'}, {'endpoints': ['opensearch_dashboards:http_2xx_opensearch_dashboards:{{ opensearch_dashboards_internal_endpoint }}/api/status'], 'enabled': '{{ enable_opensearch_dashboards | bool }}'}, {'endpoints': ['opensearch_dashboards_external:http_2xx_opensearch_dashboards:{{ opensearch_dashboards_external_endpoint }}/api/status'], 'enabled': '{{ enable_opensearch_dashboards_external | bool }}'}, {'endpoints': ['prometheus:http_2xx_prometheus:{{ prometheus_public_endpoint if enable_prometheus_server_external else prometheus_internal_endpoint }}/-/healthy'], 'enabled': '{{ enable_prometheus | bool }}'}, {'endpoints': ['prometheus_alertmanager:http_2xx_alertmanager:{{ prometheus_alertmanager_public_endpoint if enable_prometheus_alertmanager_external else prometheus_alertmanager_internal_endpoint }}'], 'enabled': '{{ enable_prometheus_alertmanager | bool }}'}, {'endpoints': \"{% set rabbitmq_endpoints = [] %}{% for host in groups.get('rabbitmq', []) %}{{ rabbitmq_endpoints.append('rabbitmq_' + host + (':tls_connect:' if rabbitmq_enable_tls | bool else ':tcp_connect:') + ('api' | kolla_address(host) | put_address_in_context('url')) + ':' + hostvars[host]['rabbitmq_port'] ) }}{% endfor %}{{ rabbitmq_endpoints }}\", 'enabled': '{{ enable_rabbitmq | bool }}'}, {'endpoints': \"{% set redis_endpoints = [] %}{% for host in groups.get('redis', []) %}{{ redis_endpoints.append('redis_' + host + ':tcp_connect:' + ('api' | kolla_address(host) | put_address_in_context('url')) + ':' + hostvars[host]['redis_port']) }}{% endfor %}{{ redis_endpoints }}\", 'enabled': '{{ enable_redis | bool }}'}]: 'swift_public_base_endpoint' is undefined"} 2025-09-27 00:59:11.604707 | orchestrator | 2025-09-27 00:59:11.604717 | orchestrator | TASK [prometheus : Copying over prometheus web config file] ******************** 2025-09-27 00:59:11.604726 | orchestrator | Saturday 27 September 2025 00:58:53 +0000 (0:00:09.857) 0:01:00.021 **** 2025-09-27 00:59:11.604736 | orchestrator | skipping: [testbed-node-0] => (item=/ansible/roles/prometheus/templates/prometheus-web.yml.j2)  2025-09-27 00:59:11.604746 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:59:11.604755 | orchestrator | skipping: [testbed-node-1] => (item=/ansible/roles/prometheus/templates/prometheus-web.yml.j2)  2025-09-27 00:59:11.604765 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:59:11.604775 | orchestrator | skipping: [testbed-node-2] => (item=/ansible/roles/prometheus/templates/prometheus-web.yml.j2)  2025-09-27 00:59:11.604784 | orchestrator | skipping: [testbed-node-2] 
2025-09-27 00:59:11.604794 | orchestrator | skipping: [testbed-node-3] => (item=/ansible/roles/prometheus/templates/prometheus-web.yml.j2)  2025-09-27 00:59:11.604804 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:59:11.604813 | orchestrator | skipping: [testbed-node-4] => (item=/ansible/roles/prometheus/templates/prometheus-web.yml.j2)  2025-09-27 00:59:11.604827 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:59:11.604837 | orchestrator | skipping: [testbed-node-5] => (item=/ansible/roles/prometheus/templates/prometheus-web.yml.j2)  2025-09-27 00:59:11.604846 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:59:11.604856 | orchestrator | 2025-09-27 00:59:11.604865 | orchestrator | TASK [prometheus : Copying over prometheus alertmanager config file] *********** 2025-09-27 00:59:11.604875 | orchestrator | Saturday 27 September 2025 00:58:55 +0000 (0:00:01.466) 0:01:01.487 **** 2025-09-27 00:59:11.604885 | orchestrator | skipping: [testbed-node-1] => (item=/opt/configuration/environments/kolla/files/overlays/prometheus/prometheus-alertmanager.yml)  2025-09-27 00:59:11.604895 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:59:11.604904 | orchestrator | skipping: [testbed-node-0] => (item=/opt/configuration/environments/kolla/files/overlays/prometheus/prometheus-alertmanager.yml)  2025-09-27 00:59:11.604914 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:59:11.604923 | orchestrator | skipping: [testbed-node-3] => (item=/opt/configuration/environments/kolla/files/overlays/prometheus/prometheus-alertmanager.yml)  2025-09-27 00:59:11.604933 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:59:11.604943 | orchestrator | skipping: [testbed-node-2] => (item=/opt/configuration/environments/kolla/files/overlays/prometheus/prometheus-alertmanager.yml)  2025-09-27 00:59:11.604952 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:59:11.604962 | orchestrator | skipping: [testbed-node-4] => (item=/opt/configuration/environments/kolla/files/overlays/prometheus/prometheus-alertmanager.yml)  2025-09-27 00:59:11.604972 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:59:11.604981 | orchestrator | skipping: [testbed-node-5] => (item=/opt/configuration/environments/kolla/files/overlays/prometheus/prometheus-alertmanager.yml)  2025-09-27 00:59:11.604991 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:59:11.605001 | orchestrator | 2025-09-27 00:59:11.605010 | orchestrator | TASK [prometheus : Find custom Alertmanager alert notification templates] ****** 2025-09-27 00:59:11.605020 | orchestrator | Saturday 27 September 2025 00:58:56 +0000 (0:00:01.159) 0:01:02.647 **** 2025-09-27 00:59:11.605080 | orchestrator | ok: [testbed-node-0 -> localhost] 2025-09-27 00:59:11.605092 | orchestrator | 2025-09-27 00:59:11.605101 | orchestrator | TASK [prometheus : Copying over custom Alertmanager alert notification templates] *** 2025-09-27 00:59:11.605111 | orchestrator | Saturday 27 September 2025 00:58:57 +0000 (0:00:00.646) 0:01:03.294 **** 2025-09-27 00:59:11.605121 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:59:11.605131 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:59:11.605147 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:59:11.605156 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:59:11.605166 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:59:11.605180 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:59:11.605188 | orchestrator | 2025-09-27 00:59:11.605196 | 
orchestrator | TASK [prometheus : Copying over my.cnf for mysqld_exporter] ******************** 2025-09-27 00:59:11.605204 | orchestrator | Saturday 27 September 2025 00:58:58 +0000 (0:00:00.740) 0:01:04.035 **** 2025-09-27 00:59:11.605212 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:59:11.605220 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:59:11.605228 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:59:11.605235 | orchestrator | changed: [testbed-node-0] 2025-09-27 00:59:11.605243 | orchestrator | changed: [testbed-node-1] 2025-09-27 00:59:11.605251 | orchestrator | changed: [testbed-node-2] 2025-09-27 00:59:11.605259 | orchestrator | 2025-09-27 00:59:11.605267 | orchestrator | TASK [prometheus : Copying cloud config file for openstack exporter] *********** 2025-09-27 00:59:11.605275 | orchestrator | Saturday 27 September 2025 00:58:59 +0000 (0:00:01.757) 0:01:05.792 **** 2025-09-27 00:59:11.605283 | orchestrator | skipping: [testbed-node-0] => (item=/ansible/roles/prometheus/templates/clouds.yml.j2)  2025-09-27 00:59:11.605290 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:59:11.605298 | orchestrator | skipping: [testbed-node-1] => (item=/ansible/roles/prometheus/templates/clouds.yml.j2)  2025-09-27 00:59:11.605306 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:59:11.605314 | orchestrator | skipping: [testbed-node-2] => (item=/ansible/roles/prometheus/templates/clouds.yml.j2)  2025-09-27 00:59:11.605322 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:59:11.605330 | orchestrator | skipping: [testbed-node-3] => (item=/ansible/roles/prometheus/templates/clouds.yml.j2)  2025-09-27 00:59:11.605338 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:59:11.605346 | orchestrator | skipping: [testbed-node-4] => (item=/ansible/roles/prometheus/templates/clouds.yml.j2)  2025-09-27 00:59:11.605353 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:59:11.605361 | orchestrator | skipping: [testbed-node-5] => (item=/ansible/roles/prometheus/templates/clouds.yml.j2)  2025-09-27 00:59:11.605369 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:59:11.605377 | orchestrator | 2025-09-27 00:59:11.605385 | orchestrator | TASK [prometheus : Copying config file for blackbox exporter] ****************** 2025-09-27 00:59:11.605393 | orchestrator | Saturday 27 September 2025 00:59:01 +0000 (0:00:01.280) 0:01:07.072 **** 2025-09-27 00:59:11.605401 | orchestrator | skipping: [testbed-node-0] => (item=/ansible/roles/prometheus/templates/prometheus-blackbox-exporter.yml.j2)  2025-09-27 00:59:11.605409 | orchestrator | skipping: [testbed-node-1] => (item=/ansible/roles/prometheus/templates/prometheus-blackbox-exporter.yml.j2)  2025-09-27 00:59:11.605417 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:59:11.605425 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:59:11.605433 | orchestrator | skipping: [testbed-node-2] => (item=/ansible/roles/prometheus/templates/prometheus-blackbox-exporter.yml.j2)  2025-09-27 00:59:11.605441 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:59:11.605453 | orchestrator | skipping: [testbed-node-3] => (item=/ansible/roles/prometheus/templates/prometheus-blackbox-exporter.yml.j2)  2025-09-27 00:59:11.605461 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:59:11.605469 | orchestrator | skipping: [testbed-node-4] => (item=/ansible/roles/prometheus/templates/prometheus-blackbox-exporter.yml.j2)  2025-09-27 00:59:11.605477 | orchestrator | skipping: [testbed-node-4] 
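The blackbox-exporter endpoints listed in the error dump above all follow a name:module:url convention (for example horizon:http_2xx:<url> or keystone:os_endpoint:<url>), where the middle field names the blackbox module used for the probe. Purely as an illustration of that format — this is not the role's own code, and the example URLs are made up around the api.testbed.osism.xyz FQDN that appears elsewhere in this log:

# Hypothetical helper: split a "name:module:url" endpoint string from the
# error dump into probe name, blackbox module and target URL.
def parse_endpoint(entry: str) -> tuple[str, str, str]:
    name, module, url = entry.split(":", 2)  # maxsplit=2: the URL itself may contain ':'
    return name, module, url

examples = [
    "horizon:http_2xx:https://api.testbed.osism.xyz:443",
    "keystone:os_endpoint:https://api.testbed.osism.xyz:5000",
]
for entry in examples:
    print(parse_endpoint(entry))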
2025-09-27 00:59:11.605484 | orchestrator | skipping: [testbed-node-5] => (item=/ansible/roles/prometheus/templates/prometheus-blackbox-exporter.yml.j2)  2025-09-27 00:59:11.605492 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:59:11.605500 | orchestrator | 2025-09-27 00:59:11.605508 | orchestrator | TASK [prometheus : Find extra prometheus server config files] ****************** 2025-09-27 00:59:11.605516 | orchestrator | Saturday 27 September 2025 00:59:02 +0000 (0:00:01.070) 0:01:08.143 **** 2025-09-27 00:59:11.605532 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:59:11.605540 | orchestrator | 2025-09-27 00:59:11.605548 | orchestrator | TASK [prometheus : Create subdirectories for extra config files] *************** 2025-09-27 00:59:11.605555 | orchestrator | Saturday 27 September 2025 00:59:02 +0000 (0:00:00.700) 0:01:08.844 **** 2025-09-27 00:59:11.605563 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:59:11.605571 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:59:11.605579 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:59:11.605587 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:59:11.605595 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:59:11.605603 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:59:11.605610 | orchestrator | 2025-09-27 00:59:11.605618 | orchestrator | TASK [prometheus : Template extra prometheus server config files] ************** 2025-09-27 00:59:11.605626 | orchestrator | Saturday 27 September 2025 00:59:03 +0000 (0:00:00.523) 0:01:09.367 **** 2025-09-27 00:59:11.605634 | orchestrator | skipping: [testbed-node-0] 2025-09-27 00:59:11.605642 | orchestrator | skipping: [testbed-node-1] 2025-09-27 00:59:11.605650 | orchestrator | skipping: [testbed-node-2] 2025-09-27 00:59:11.605658 | orchestrator | skipping: [testbed-node-3] 2025-09-27 00:59:11.605665 | orchestrator | skipping: [testbed-node-4] 2025-09-27 00:59:11.605673 | orchestrator | skipping: [testbed-node-5] 2025-09-27 00:59:11.605681 | orchestrator | 2025-09-27 00:59:11.605689 | orchestrator | TASK [prometheus : Check prometheus containers] ******************************** 2025-09-27 00:59:11.605697 | orchestrator | Saturday 27 September 2025 00:59:03 +0000 (0:00:00.597) 0:01:09.965 **** 2025-09-27 00:59:11.605710 | orchestrator | changed: [testbed-node-2] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-node-exporter:2024.2', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}}) 2025-09-27 00:59:11.605719 | orchestrator | changed: [testbed-node-1] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-node-exporter:2024.2', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}}) 2025-09-27 00:59:11.605727 | orchestrator | changed: [testbed-node-3] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 
'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-node-exporter:2024.2', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}}) 2025-09-27 00:59:11.605736 | orchestrator | changed: [testbed-node-0] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-node-exporter:2024.2', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}}) 2025-09-27 00:59:11.605744 | orchestrator | changed: [testbed-node-5] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-node-exporter:2024.2', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}}) 2025-09-27 00:59:11.605761 | orchestrator | changed: [testbed-node-4] => (item={'key': 'prometheus-node-exporter', 'value': {'container_name': 'prometheus_node_exporter', 'group': 'prometheus-node-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-node-exporter:2024.2', 'pid_mode': 'host', 'volumes': ['/etc/kolla/prometheus-node-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/host:ro,rslave'], 'dimensions': {}}}) 2025-09-27 00:59:11.605770 | orchestrator | changed: [testbed-node-2] => (item={'key': 'prometheus-mysqld-exporter', 'value': {'container_name': 'prometheus_mysqld_exporter', 'group': 'prometheus-mysqld-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-mysqld-exporter:2024.2', 'volumes': ['/etc/kolla/prometheus-mysqld-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-09-27 00:59:11.605778 | orchestrator | changed: [testbed-node-1] => (item={'key': 'prometheus-mysqld-exporter', 'value': {'container_name': 'prometheus_mysqld_exporter', 'group': 'prometheus-mysqld-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-mysqld-exporter:2024.2', 'volumes': ['/etc/kolla/prometheus-mysqld-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-09-27 00:59:11.605790 | orchestrator | changed: [testbed-node-3] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-cadvisor:2024.2', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', 
'/dev/disk/:/dev/disk:ro'], 'dimensions': {}}}) 2025-09-27 00:59:11.605799 | orchestrator | changed: [testbed-node-0] => (item={'key': 'prometheus-mysqld-exporter', 'value': {'container_name': 'prometheus_mysqld_exporter', 'group': 'prometheus-mysqld-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-mysqld-exporter:2024.2', 'volumes': ['/etc/kolla/prometheus-mysqld-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-09-27 00:59:11.605807 | orchestrator | changed: [testbed-node-5] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-cadvisor:2024.2', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}}) 2025-09-27 00:59:11.605815 | orchestrator | changed: [testbed-node-4] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-cadvisor:2024.2', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}}) 2025-09-27 00:59:11.605835 | orchestrator | changed: [testbed-node-1] => (item={'key': 'prometheus-memcached-exporter', 'value': {'container_name': 'prometheus_memcached_exporter', 'group': 'prometheus-memcached-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-memcached-exporter:2024.2', 'volumes': ['/etc/kolla/prometheus-memcached-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-09-27 00:59:11.605843 | orchestrator | changed: [testbed-node-2] => (item={'key': 'prometheus-memcached-exporter', 'value': {'container_name': 'prometheus_memcached_exporter', 'group': 'prometheus-memcached-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-memcached-exporter:2024.2', 'volumes': ['/etc/kolla/prometheus-memcached-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-09-27 00:59:11.605852 | orchestrator | changed: [testbed-node-3] => (item={'key': 'prometheus-libvirt-exporter', 'value': {'container_name': 'prometheus_libvirt_exporter', 'group': 'prometheus-libvirt-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-libvirt-exporter:2024.2', 'volumes': ['/etc/kolla/prometheus-libvirt-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/run/libvirt:/run/libvirt:ro'], 'dimensions': {}}}) 2025-09-27 00:59:11.605860 | orchestrator | changed: [testbed-node-0] => (item={'key': 'prometheus-memcached-exporter', 'value': {'container_name': 'prometheus_memcached_exporter', 'group': 'prometheus-memcached-exporter', 'enabled': 
True, 'image': 'registry.osism.tech/kolla/prometheus-memcached-exporter:2024.2', 'volumes': ['/etc/kolla/prometheus-memcached-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-09-27 00:59:11.605868 | orchestrator | changed: [testbed-node-5] => (item={'key': 'prometheus-libvirt-exporter', 'value': {'container_name': 'prometheus_libvirt_exporter', 'group': 'prometheus-libvirt-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-libvirt-exporter:2024.2', 'volumes': ['/etc/kolla/prometheus-libvirt-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/run/libvirt:/run/libvirt:ro'], 'dimensions': {}}}) 2025-09-27 00:59:11.605881 | orchestrator | changed: [testbed-node-4] => (item={'key': 'prometheus-libvirt-exporter', 'value': {'container_name': 'prometheus_libvirt_exporter', 'group': 'prometheus-libvirt-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-libvirt-exporter:2024.2', 'volumes': ['/etc/kolla/prometheus-libvirt-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', '/run/libvirt:/run/libvirt:ro'], 'dimensions': {}}}) 2025-09-27 00:59:11.605889 | orchestrator | changed: [testbed-node-2] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-cadvisor:2024.2', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}}) 2025-09-27 00:59:11.605897 | orchestrator | changed: [testbed-node-1] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-cadvisor:2024.2', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}}) 2025-09-27 00:59:11.605915 | orchestrator | changed: [testbed-node-0] => (item={'key': 'prometheus-cadvisor', 'value': {'container_name': 'prometheus_cadvisor', 'group': 'prometheus-cadvisor', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-cadvisor:2024.2', 'volumes': ['/etc/kolla/prometheus-cadvisor/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/', '/:/rootfs:ro', '/var/run:/var/run:rw', '/sys:/sys:ro', '/var/lib/docker/:/var/lib/docker:ro', '/dev/disk/:/dev/disk:ro'], 'dimensions': {}}}) 2025-09-27 00:59:11.605923 | orchestrator | changed: [testbed-node-2] => (item={'key': 'prometheus-elasticsearch-exporter', 'value': {'container_name': 'prometheus_elasticsearch_exporter', 'group': 'prometheus-elasticsearch-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-elasticsearch-exporter:2024.2', 'volumes': ['/etc/kolla/prometheus-elasticsearch-exporter/:/var/lib/kolla/config_files/:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-09-27 00:59:11.605932 | orchestrator | changed: [testbed-node-1] => (item={'key': 'prometheus-elasticsearch-exporter', 'value': {'container_name': 'prometheus_elasticsearch_exporter', 'group': 'prometheus-elasticsearch-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-elasticsearch-exporter:2024.2', 'volumes': ['/etc/kolla/prometheus-elasticsearch-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-09-27 00:59:11.605940 | orchestrator | changed: [testbed-node-0] => (item={'key': 'prometheus-elasticsearch-exporter', 'value': {'container_name': 'prometheus_elasticsearch_exporter', 'group': 'prometheus-elasticsearch-exporter', 'enabled': True, 'image': 'registry.osism.tech/kolla/prometheus-elasticsearch-exporter:2024.2', 'volumes': ['/etc/kolla/prometheus-elasticsearch-exporter/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}}}) 2025-09-27 00:59:11.605948 | orchestrator | 2025-09-27 00:59:11.605956 | orchestrator | TASK [prometheus : Creating prometheus database user and setting permissions] *** 2025-09-27 00:59:11.605964 | orchestrator | Saturday 27 September 2025 00:59:07 +0000 (0:00:03.714) 0:01:13.680 **** 2025-09-27 00:59:11.605977 | orchestrator | failed: [testbed-node-0] (item=testbed-node-0) => {"ansible_loop_var": "item", "changed": false, "item": {"key": "0", "value": {"hosts": ["testbed-node-0", "testbed-node-1", "testbed-node-2"]}}, "msg": "kolla_toolbox container is not running."} 2025-09-27 00:59:11.605987 | orchestrator | 2025-09-27 00:59:11.605995 | orchestrator | PLAY RECAP ********************************************************************* 2025-09-27 00:59:11.606003 | orchestrator | testbed-manager : ok=11  changed=4  unreachable=0 failed=1  skipped=2  rescued=0 ignored=0 2025-09-27 00:59:11.606012 | orchestrator | testbed-node-0 : ok=11  changed=5  unreachable=0 failed=1  skipped=12  rescued=0 ignored=0 2025-09-27 00:59:11.606058 | orchestrator | testbed-node-1 : ok=10  changed=5  unreachable=0 failed=0 skipped=11  rescued=0 ignored=0 2025-09-27 00:59:11.606066 | orchestrator | testbed-node-2 : ok=10  changed=5  unreachable=0 failed=0 skipped=11  rescued=0 ignored=0 2025-09-27 00:59:11.606074 | orchestrator | testbed-node-3 : ok=9  changed=4  unreachable=0 failed=0 skipped=12  rescued=0 ignored=0 2025-09-27 00:59:11.606088 | orchestrator | testbed-node-4 : ok=9  changed=4  unreachable=0 failed=0 skipped=12  rescued=0 ignored=0 2025-09-27 00:59:11.606096 | orchestrator | testbed-node-5 : ok=9  changed=4  unreachable=0 failed=0 skipped=12  rescued=0 ignored=0 2025-09-27 00:59:11.606104 | orchestrator | 2025-09-27 00:59:11.606112 | orchestrator | 2025-09-27 00:59:11.606120 | orchestrator | TASKS RECAP ******************************************************************** 2025-09-27 00:59:11.606128 | orchestrator | Saturday 27 September 2025 00:59:09 +0000 (0:00:01.686) 0:01:15.367 **** 2025-09-27 00:59:11.606136 | orchestrator | =============================================================================== 2025-09-27 00:59:11.606143 | orchestrator | prometheus : Copying over custom prometheus alert rules files ---------- 23.12s 2025-09-27 00:59:11.606151 | orchestrator | prometheus : Copying over prometheus 
config file ------------------------ 9.86s 2025-09-27 00:59:11.606159 | orchestrator | prometheus : Copying over config.json files ----------------------------- 6.16s 2025-09-27 00:59:11.606167 | orchestrator | service-cert-copy : prometheus | Copying over extra CA certificates ----- 5.71s 2025-09-27 00:59:11.606175 | orchestrator | prometheus : Check prometheus containers -------------------------------- 3.71s 2025-09-27 00:59:11.606182 | orchestrator | prometheus : Ensuring config directories exist -------------------------- 3.09s 2025-09-27 00:59:11.606190 | orchestrator | service-cert-copy : prometheus | Copying over backend internal TLS key --- 2.03s 2025-09-27 00:59:11.606202 | orchestrator | prometheus : Copying over my.cnf for mysqld_exporter -------------------- 1.76s 2025-09-27 00:59:11.606210 | orchestrator | prometheus : Creating prometheus database user and setting permissions --- 1.69s 2025-09-27 00:59:11.606218 | orchestrator | prometheus : Find prometheus host config overrides ---------------------- 1.60s 2025-09-27 00:59:11.606226 | orchestrator | prometheus : include_tasks ---------------------------------------------- 1.52s 2025-09-27 00:59:11.606233 | orchestrator | service-cert-copy : prometheus | Copying over backend internal TLS certificate --- 1.48s 2025-09-27 00:59:11.606241 | orchestrator | prometheus : Copying over prometheus web config file -------------------- 1.47s 2025-09-27 00:59:11.606249 | orchestrator | prometheus : include_tasks ---------------------------------------------- 1.39s 2025-09-27 00:59:11.606257 | orchestrator | prometheus : Copying cloud config file for openstack exporter ----------- 1.28s 2025-09-27 00:59:11.606265 | orchestrator | prometheus : Copying over prometheus alertmanager config file ----------- 1.16s 2025-09-27 00:59:11.606272 | orchestrator | prometheus : Copying config file for blackbox exporter ------------------ 1.07s 2025-09-27 00:59:11.606280 | orchestrator | prometheus : Find custom prometheus alert rules files ------------------- 1.04s 2025-09-27 00:59:11.606288 | orchestrator | Group hosts based on enabled services ----------------------------------- 1.01s 2025-09-27 00:59:11.606295 | orchestrator | Group hosts based on Kolla action --------------------------------------- 0.96s 2025-09-27 00:59:11.606303 | orchestrator | 2025-09-27 00:59:11 | INFO  | Task 6dac5c74-45a5-4157-8ab5-943abd7ce6bf is in state STARTED 2025-09-27 00:59:11.606731 | orchestrator | 2025-09-27 00:59:11 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 00:59:11.609409 | orchestrator | 2025-09-27 00:59:11 | INFO  | Task 452cb9cb-b9d2-4e68-b922-2c6a8a665f16 is in state STARTED 2025-09-27 00:59:11.609490 | orchestrator | 2025-09-27 00:59:11 | INFO  | Wait 1 second(s) until the next check 2025-09-27 00:59:14.648894 | orchestrator | 2025-09-27 00:59:14 | INFO  | Task e53b93e4-c64c-4396-8a50-60d422b94ec3 is in state STARTED 2025-09-27 00:59:14.649001 | orchestrator | 2025-09-27 00:59:14 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 00:59:14.650178 | orchestrator | 2025-09-27 00:59:14 | INFO  | Task 6dac5c74-45a5-4157-8ab5-943abd7ce6bf is in state STARTED 2025-09-27 00:59:14.650935 | orchestrator | 2025-09-27 00:59:14 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 00:59:14.651650 | orchestrator | 2025-09-27 00:59:14 | INFO  | Task 452cb9cb-b9d2-4e68-b922-2c6a8a665f16 is in state STARTED 2025-09-27 00:59:14.651966 | orchestrator | 2025-09-27 00:59:14 | 
INFO  | Wait 1 second(s) until the next check 2025-09-27 00:59:17.689102 | orchestrator | 2025-09-27 00:59:17 | INFO  | Task e53b93e4-c64c-4396-8a50-60d422b94ec3 is in state STARTED 2025-09-27 00:59:17.690309 | orchestrator | 2025-09-27 00:59:17 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 00:59:17.692239 | orchestrator | 2025-09-27 00:59:17 | INFO  | Task 6dac5c74-45a5-4157-8ab5-943abd7ce6bf is in state STARTED 2025-09-27 00:59:17.693992 | orchestrator | 2025-09-27 00:59:17 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 00:59:17.696447 | orchestrator | 2025-09-27 00:59:17 | INFO  | Task 452cb9cb-b9d2-4e68-b922-2c6a8a665f16 is in state STARTED 2025-09-27 00:59:17.696475 | orchestrator | 2025-09-27 00:59:17 | INFO  | Wait 1 second(s) until the next check 2025-09-27 00:59:20.750608 | orchestrator | 2025-09-27 00:59:20 | INFO  | Task e53b93e4-c64c-4396-8a50-60d422b94ec3 is in state STARTED 2025-09-27 00:59:20.753288 | orchestrator | 2025-09-27 00:59:20 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 00:59:20.756793 | orchestrator | 2025-09-27 00:59:20 | INFO  | Task 6dac5c74-45a5-4157-8ab5-943abd7ce6bf is in state STARTED 2025-09-27 00:59:20.759266 | orchestrator | 2025-09-27 00:59:20 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 00:59:20.760859 | orchestrator | 2025-09-27 00:59:20 | INFO  | Task 452cb9cb-b9d2-4e68-b922-2c6a8a665f16 is in state SUCCESS 2025-09-27 00:59:20.760882 | orchestrator | 2025-09-27 00:59:20 | INFO  | Wait 1 second(s) until the next check 2025-09-27 00:59:23.800269 | orchestrator | 2025-09-27 00:59:23 | INFO  | Task e53b93e4-c64c-4396-8a50-60d422b94ec3 is in state STARTED 2025-09-27 00:59:23.801692 | orchestrator | 2025-09-27 00:59:23 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 00:59:23.803150 | orchestrator | 2025-09-27 00:59:23 | INFO  | Task 6dac5c74-45a5-4157-8ab5-943abd7ce6bf is in state STARTED 2025-09-27 00:59:23.804628 | orchestrator | 2025-09-27 00:59:23 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 00:59:23.804671 | orchestrator | 2025-09-27 00:59:23 | INFO  | Wait 1 second(s) until the next check 2025-09-27 00:59:26.846485 | orchestrator | 2025-09-27 00:59:26 | INFO  | Task e53b93e4-c64c-4396-8a50-60d422b94ec3 is in state STARTED 2025-09-27 00:59:26.847825 | orchestrator | 2025-09-27 00:59:26 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 00:59:26.848973 | orchestrator | 2025-09-27 00:59:26 | INFO  | Task 6dac5c74-45a5-4157-8ab5-943abd7ce6bf is in state STARTED 2025-09-27 00:59:26.850437 | orchestrator | 2025-09-27 00:59:26 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 00:59:26.850463 | orchestrator | 2025-09-27 00:59:26 | INFO  | Wait 1 second(s) until the next check 2025-09-27 00:59:29.894959 | orchestrator | 2025-09-27 00:59:29 | INFO  | Task e53b93e4-c64c-4396-8a50-60d422b94ec3 is in state STARTED 2025-09-27 00:59:29.895735 | orchestrator | 2025-09-27 00:59:29 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 00:59:29.896597 | orchestrator | 2025-09-27 00:59:29 | INFO  | Task 6dac5c74-45a5-4157-8ab5-943abd7ce6bf is in state STARTED 2025-09-27 00:59:29.897962 | orchestrator | 2025-09-27 00:59:29 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 00:59:29.898383 | orchestrator | 2025-09-27 00:59:29 | 
INFO  | Wait 1 second(s) until the next check 2025-09-27 00:59:32.962674 | orchestrator | 2025-09-27 00:59:32 | INFO  | Task e53b93e4-c64c-4396-8a50-60d422b94ec3 is in state STARTED 2025-09-27 00:59:32.964067 | orchestrator | 2025-09-27 00:59:32 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 00:59:32.964943 | orchestrator | 2025-09-27 00:59:32 | INFO  | Task 6dac5c74-45a5-4157-8ab5-943abd7ce6bf is in state STARTED 2025-09-27 00:59:32.966886 | orchestrator | 2025-09-27 00:59:32 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 00:59:32.966951 | orchestrator | 2025-09-27 00:59:32 | INFO  | Wait 1 second(s) until the next check 2025-09-27 00:59:36.030651 | orchestrator | 2025-09-27 00:59:36 | INFO  | Task e53b93e4-c64c-4396-8a50-60d422b94ec3 is in state STARTED 2025-09-27 00:59:36.031259 | orchestrator | 2025-09-27 00:59:36 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 00:59:36.032214 | orchestrator | 2025-09-27 00:59:36 | INFO  | Task 6dac5c74-45a5-4157-8ab5-943abd7ce6bf is in state STARTED 2025-09-27 00:59:36.033118 | orchestrator | 2025-09-27 00:59:36 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 00:59:36.033143 | orchestrator | 2025-09-27 00:59:36 | INFO  | Wait 1 second(s) until the next check 2025-09-27 00:59:39.074538 | orchestrator | 2025-09-27 00:59:39 | INFO  | Task e53b93e4-c64c-4396-8a50-60d422b94ec3 is in state STARTED 2025-09-27 00:59:39.075273 | orchestrator | 2025-09-27 00:59:39 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 00:59:39.076705 | orchestrator | 2025-09-27 00:59:39 | INFO  | Task 6dac5c74-45a5-4157-8ab5-943abd7ce6bf is in state STARTED 2025-09-27 00:59:39.078002 | orchestrator | 2025-09-27 00:59:39 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 00:59:39.078092 | orchestrator | 2025-09-27 00:59:39 | INFO  | Wait 1 second(s) until the next check 2025-09-27 00:59:42.128615 | orchestrator | 2025-09-27 00:59:42 | INFO  | Task e53b93e4-c64c-4396-8a50-60d422b94ec3 is in state STARTED 2025-09-27 00:59:42.129501 | orchestrator | 2025-09-27 00:59:42 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 00:59:42.132490 | orchestrator | 2025-09-27 00:59:42 | INFO  | Task 6dac5c74-45a5-4157-8ab5-943abd7ce6bf is in state STARTED 2025-09-27 00:59:42.135434 | orchestrator | 2025-09-27 00:59:42 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 00:59:42.135602 | orchestrator | 2025-09-27 00:59:42 | INFO  | Wait 1 second(s) until the next check 2025-09-27 00:59:45.180733 | orchestrator | 2025-09-27 00:59:45 | INFO  | Task e53b93e4-c64c-4396-8a50-60d422b94ec3 is in state STARTED 2025-09-27 00:59:45.182638 | orchestrator | 2025-09-27 00:59:45 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 00:59:45.183733 | orchestrator | 2025-09-27 00:59:45 | INFO  | Task 6dac5c74-45a5-4157-8ab5-943abd7ce6bf is in state STARTED 2025-09-27 00:59:45.185011 | orchestrator | 2025-09-27 00:59:45 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 00:59:45.185052 | orchestrator | 2025-09-27 00:59:45 | INFO  | Wait 1 second(s) until the next check 2025-09-27 00:59:48.232830 | orchestrator | 2025-09-27 00:59:48 | INFO  | Task e53b93e4-c64c-4396-8a50-60d422b94ec3 is in state STARTED 2025-09-27 00:59:48.234905 | orchestrator | 2025-09-27 00:59:48 | INFO  | Task 
c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 00:59:48.237436 | orchestrator | 2025-09-27 00:59:48 | INFO  | Task 6dac5c74-45a5-4157-8ab5-943abd7ce6bf is in state STARTED 2025-09-27 00:59:48.239353 | orchestrator | 2025-09-27 00:59:48 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 00:59:48.239460 | orchestrator | 2025-09-27 00:59:48 | INFO  | Wait 1 second(s) until the next check 2025-09-27 00:59:51.292949 | orchestrator | 2025-09-27 00:59:51 | INFO  | Task e53b93e4-c64c-4396-8a50-60d422b94ec3 is in state STARTED 2025-09-27 00:59:51.295065 | orchestrator | 2025-09-27 00:59:51 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 00:59:51.297096 | orchestrator | 2025-09-27 00:59:51 | INFO  | Task 6dac5c74-45a5-4157-8ab5-943abd7ce6bf is in state STARTED 2025-09-27 00:59:51.299403 | orchestrator | 2025-09-27 00:59:51 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 00:59:51.299419 | orchestrator | 2025-09-27 00:59:51 | INFO  | Wait 1 second(s) until the next check 2025-09-27 00:59:54.347882 | orchestrator | 2025-09-27 00:59:54 | INFO  | Task e53b93e4-c64c-4396-8a50-60d422b94ec3 is in state STARTED 2025-09-27 00:59:54.349896 | orchestrator | 2025-09-27 00:59:54 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 00:59:54.351988 | orchestrator | 2025-09-27 00:59:54 | INFO  | Task 6dac5c74-45a5-4157-8ab5-943abd7ce6bf is in state STARTED 2025-09-27 00:59:54.353863 | orchestrator | 2025-09-27 00:59:54 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 00:59:54.353936 | orchestrator | 2025-09-27 00:59:54 | INFO  | Wait 1 second(s) until the next check 2025-09-27 00:59:57.396995 | orchestrator | 2025-09-27 00:59:57 | INFO  | Task e53b93e4-c64c-4396-8a50-60d422b94ec3 is in state STARTED 2025-09-27 00:59:57.399562 | orchestrator | 2025-09-27 00:59:57 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 00:59:57.401633 | orchestrator | 2025-09-27 00:59:57 | INFO  | Task 6dac5c74-45a5-4157-8ab5-943abd7ce6bf is in state STARTED 2025-09-27 00:59:57.403274 | orchestrator | 2025-09-27 00:59:57 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 00:59:57.403417 | orchestrator | 2025-09-27 00:59:57 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:00:00.436943 | orchestrator | 2025-09-27 01:00:00 | INFO  | Task e53b93e4-c64c-4396-8a50-60d422b94ec3 is in state STARTED 2025-09-27 01:00:00.438451 | orchestrator | 2025-09-27 01:00:00 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:00:00.442521 | orchestrator | 2025-09-27 01:00:00 | INFO  | Task 6dac5c74-45a5-4157-8ab5-943abd7ce6bf is in state STARTED 2025-09-27 01:00:00.444346 | orchestrator | 2025-09-27 01:00:00 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:00:00.444554 | orchestrator | 2025-09-27 01:00:00 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:00:03.486760 | orchestrator | 2025-09-27 01:00:03 | INFO  | Task e53b93e4-c64c-4396-8a50-60d422b94ec3 is in state STARTED 2025-09-27 01:00:03.489009 | orchestrator | 2025-09-27 01:00:03 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:00:03.491268 | orchestrator | 2025-09-27 01:00:03 | INFO  | Task 6dac5c74-45a5-4157-8ab5-943abd7ce6bf is in state STARTED 2025-09-27 01:00:03.495506 | orchestrator | 2025-09-27 01:00:03 | INFO  | Task 
6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:00:03.495532 | orchestrator | 2025-09-27 01:00:03 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:00:06.528624 | orchestrator | 2025-09-27 01:00:06 | INFO  | Task e53b93e4-c64c-4396-8a50-60d422b94ec3 is in state STARTED 2025-09-27 01:00:06.530377 | orchestrator | 2025-09-27 01:00:06 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:00:06.531726 | orchestrator | 2025-09-27 01:00:06 | INFO  | Task 6dac5c74-45a5-4157-8ab5-943abd7ce6bf is in state STARTED 2025-09-27 01:00:06.533309 | orchestrator | 2025-09-27 01:00:06 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:00:06.533336 | orchestrator | 2025-09-27 01:00:06 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:00:09.575783 | orchestrator | 2025-09-27 01:00:09 | INFO  | Task e53b93e4-c64c-4396-8a50-60d422b94ec3 is in state STARTED 2025-09-27 01:00:09.577176 | orchestrator | 2025-09-27 01:00:09 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:00:09.580643 | orchestrator | 2025-09-27 01:00:09 | INFO  | Task 6dac5c74-45a5-4157-8ab5-943abd7ce6bf is in state SUCCESS 2025-09-27 01:00:09.582166 | orchestrator | 2025-09-27 01:00:09.582199 | orchestrator | 2025-09-27 01:00:09.582211 | orchestrator | PLAY [Bootstraph ceph dashboard] *********************************************** 2025-09-27 01:00:09.582222 | orchestrator | 2025-09-27 01:00:09.582233 | orchestrator | TASK [Disable the ceph dashboard] ********************************************** 2025-09-27 01:00:09.582245 | orchestrator | Saturday 27 September 2025 00:57:53 +0000 (0:00:00.279) 0:00:00.279 **** 2025-09-27 01:00:09.582255 | orchestrator | changed: [testbed-manager] 2025-09-27 01:00:09.582267 | orchestrator | 2025-09-27 01:00:09.582278 | orchestrator | TASK [Set mgr/dashboard/ssl to false] ****************************************** 2025-09-27 01:00:09.582288 | orchestrator | Saturday 27 September 2025 00:57:55 +0000 (0:00:01.629) 0:00:01.909 **** 2025-09-27 01:00:09.582299 | orchestrator | changed: [testbed-manager] 2025-09-27 01:00:09.582310 | orchestrator | 2025-09-27 01:00:09.582321 | orchestrator | TASK [Set mgr/dashboard/server_port to 7000] *********************************** 2025-09-27 01:00:09.582332 | orchestrator | Saturday 27 September 2025 00:57:56 +0000 (0:00:01.125) 0:00:03.034 **** 2025-09-27 01:00:09.582342 | orchestrator | changed: [testbed-manager] 2025-09-27 01:00:09.582353 | orchestrator | 2025-09-27 01:00:09.582364 | orchestrator | TASK [Set mgr/dashboard/server_addr to 0.0.0.0] ******************************** 2025-09-27 01:00:09.582374 | orchestrator | Saturday 27 September 2025 00:57:57 +0000 (0:00:01.081) 0:00:04.116 **** 2025-09-27 01:00:09.582385 | orchestrator | changed: [testbed-manager] 2025-09-27 01:00:09.582396 | orchestrator | 2025-09-27 01:00:09.582406 | orchestrator | TASK [Set mgr/dashboard/standby_behaviour to error] **************************** 2025-09-27 01:00:09.582417 | orchestrator | Saturday 27 September 2025 00:57:59 +0000 (0:00:01.653) 0:00:05.769 **** 2025-09-27 01:00:09.582428 | orchestrator | changed: [testbed-manager] 2025-09-27 01:00:09.582438 | orchestrator | 2025-09-27 01:00:09.582449 | orchestrator | TASK [Set mgr/dashboard/standby_error_status_code to 404] ********************** 2025-09-27 01:00:09.582460 | orchestrator | Saturday 27 September 2025 00:58:00 +0000 (0:00:01.072) 0:00:06.842 **** 2025-09-27 01:00:09.582471 | 
orchestrator | changed: [testbed-manager] 2025-09-27 01:00:09.582481 | orchestrator | 2025-09-27 01:00:09.582618 | orchestrator | TASK [Enable the ceph dashboard] *********************************************** 2025-09-27 01:00:09.582630 | orchestrator | Saturday 27 September 2025 00:58:01 +0000 (0:00:01.046) 0:00:07.888 **** 2025-09-27 01:00:09.582640 | orchestrator | changed: [testbed-manager] 2025-09-27 01:00:09.582651 | orchestrator | 2025-09-27 01:00:09.582661 | orchestrator | TASK [Write ceph_dashboard_password to temporary file] ************************* 2025-09-27 01:00:09.582672 | orchestrator | Saturday 27 September 2025 00:58:03 +0000 (0:00:02.095) 0:00:09.984 **** 2025-09-27 01:00:09.582683 | orchestrator | changed: [testbed-manager] 2025-09-27 01:00:09.582693 | orchestrator | 2025-09-27 01:00:09.582736 | orchestrator | TASK [Create admin user] ******************************************************* 2025-09-27 01:00:09.582747 | orchestrator | Saturday 27 September 2025 00:58:04 +0000 (0:00:01.181) 0:00:11.165 **** 2025-09-27 01:00:09.582758 | orchestrator | changed: [testbed-manager] 2025-09-27 01:00:09.582769 | orchestrator | 2025-09-27 01:00:09.582780 | orchestrator | TASK [Remove temporary file for ceph_dashboard_password] *********************** 2025-09-27 01:00:09.582870 | orchestrator | Saturday 27 September 2025 00:58:54 +0000 (0:00:49.467) 0:01:00.632 **** 2025-09-27 01:00:09.582883 | orchestrator | skipping: [testbed-manager] 2025-09-27 01:00:09.582895 | orchestrator | 2025-09-27 01:00:09.582905 | orchestrator | PLAY [Restart ceph manager services] ******************************************* 2025-09-27 01:00:09.582916 | orchestrator | 2025-09-27 01:00:09.582927 | orchestrator | TASK [Restart ceph manager service] ******************************************** 2025-09-27 01:00:09.582938 | orchestrator | Saturday 27 September 2025 00:58:54 +0000 (0:00:00.239) 0:01:00.871 **** 2025-09-27 01:00:09.582948 | orchestrator | changed: [testbed-node-0] 2025-09-27 01:00:09.582959 | orchestrator | 2025-09-27 01:00:09.582970 | orchestrator | PLAY [Restart ceph manager services] ******************************************* 2025-09-27 01:00:09.582980 | orchestrator | 2025-09-27 01:00:09.582991 | orchestrator | TASK [Restart ceph manager service] ******************************************** 2025-09-27 01:00:09.583002 | orchestrator | Saturday 27 September 2025 00:58:56 +0000 (0:00:01.618) 0:01:02.489 **** 2025-09-27 01:00:09.583013 | orchestrator | changed: [testbed-node-1] 2025-09-27 01:00:09.583047 | orchestrator | 2025-09-27 01:00:09.583059 | orchestrator | PLAY [Restart ceph manager services] ******************************************* 2025-09-27 01:00:09.583074 | orchestrator | 2025-09-27 01:00:09.583093 | orchestrator | TASK [Restart ceph manager service] ******************************************** 2025-09-27 01:00:09.583111 | orchestrator | Saturday 27 September 2025 00:59:07 +0000 (0:00:11.315) 0:01:13.805 **** 2025-09-27 01:00:09.583150 | orchestrator | changed: [testbed-node-2] 2025-09-27 01:00:09.583164 | orchestrator | 2025-09-27 01:00:09.583175 | orchestrator | PLAY RECAP ********************************************************************* 2025-09-27 01:00:09.583187 | orchestrator | testbed-manager : ok=9  changed=9  unreachable=0 failed=0 skipped=1  rescued=0 ignored=0 2025-09-27 01:00:09.583198 | orchestrator | testbed-node-0 : ok=1  changed=1  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 2025-09-27 01:00:09.583209 | orchestrator | testbed-node-1 : ok=1  
changed=1  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 2025-09-27 01:00:09.583220 | orchestrator | testbed-node-2 : ok=1  changed=1  unreachable=0 failed=0 skipped=0 rescued=0 ignored=0 2025-09-27 01:00:09.583230 | orchestrator | 2025-09-27 01:00:09.583241 | orchestrator | 2025-09-27 01:00:09.583252 | orchestrator | 2025-09-27 01:00:09.583262 | orchestrator | TASKS RECAP ******************************************************************** 2025-09-27 01:00:09.583273 | orchestrator | Saturday 27 September 2025 00:59:18 +0000 (0:00:11.174) 0:01:24.980 **** 2025-09-27 01:00:09.583293 | orchestrator | =============================================================================== 2025-09-27 01:00:09.583305 | orchestrator | Create admin user ------------------------------------------------------ 49.47s 2025-09-27 01:00:09.583316 | orchestrator | Restart ceph manager service ------------------------------------------- 24.11s 2025-09-27 01:00:09.583340 | orchestrator | Enable the ceph dashboard ----------------------------------------------- 2.10s 2025-09-27 01:00:09.583351 | orchestrator | Set mgr/dashboard/server_addr to 0.0.0.0 -------------------------------- 1.65s 2025-09-27 01:00:09.583362 | orchestrator | Disable the ceph dashboard ---------------------------------------------- 1.63s 2025-09-27 01:00:09.583373 | orchestrator | Write ceph_dashboard_password to temporary file ------------------------- 1.18s 2025-09-27 01:00:09.583384 | orchestrator | Set mgr/dashboard/ssl to false ------------------------------------------ 1.13s 2025-09-27 01:00:09.583394 | orchestrator | Set mgr/dashboard/server_port to 7000 ----------------------------------- 1.08s 2025-09-27 01:00:09.583405 | orchestrator | Set mgr/dashboard/standby_behaviour to error ---------------------------- 1.07s 2025-09-27 01:00:09.583416 | orchestrator | Set mgr/dashboard/standby_error_status_code to 404 ---------------------- 1.05s 2025-09-27 01:00:09.583426 | orchestrator | Remove temporary file for ceph_dashboard_password ----------------------- 0.24s 2025-09-27 01:00:09.583446 | orchestrator | 2025-09-27 01:00:09.583456 | orchestrator | 2025-09-27 01:00:09.583467 | orchestrator | PLAY [Group hosts based on configuration] ************************************** 2025-09-27 01:00:09.583478 | orchestrator | 2025-09-27 01:00:09.583488 | orchestrator | TASK [Group hosts based on Kolla action] *************************************** 2025-09-27 01:00:09.583499 | orchestrator | Saturday 27 September 2025 00:59:13 +0000 (0:00:00.256) 0:00:00.256 **** 2025-09-27 01:00:09.583510 | orchestrator | ok: [testbed-node-0] 2025-09-27 01:00:09.583521 | orchestrator | ok: [testbed-node-1] 2025-09-27 01:00:09.583531 | orchestrator | ok: [testbed-node-2] 2025-09-27 01:00:09.583542 | orchestrator | 2025-09-27 01:00:09.583553 | orchestrator | TASK [Group hosts based on enabled services] *********************************** 2025-09-27 01:00:09.583563 | orchestrator | Saturday 27 September 2025 00:59:14 +0000 (0:00:00.299) 0:00:00.555 **** 2025-09-27 01:00:09.583574 | orchestrator | ok: [testbed-node-0] => (item=enable_grafana_True) 2025-09-27 01:00:09.583585 | orchestrator | ok: [testbed-node-1] => (item=enable_grafana_True) 2025-09-27 01:00:09.583595 | orchestrator | ok: [testbed-node-2] => (item=enable_grafana_True) 2025-09-27 01:00:09.583606 | orchestrator | 2025-09-27 01:00:09.583616 | orchestrator | PLAY [Apply role grafana] ****************************************************** 2025-09-27 01:00:09.583627 | orchestrator | 2025-09-27 
01:00:09.583638 | orchestrator | TASK [grafana : include_tasks] ************************************************* 2025-09-27 01:00:09.583648 | orchestrator | Saturday 27 September 2025 00:59:14 +0000 (0:00:00.411) 0:00:00.966 **** 2025-09-27 01:00:09.583659 | orchestrator | included: /ansible/roles/grafana/tasks/deploy.yml for testbed-node-0, testbed-node-1, testbed-node-2 2025-09-27 01:00:09.583670 | orchestrator | 2025-09-27 01:00:09.583681 | orchestrator | TASK [grafana : Ensuring config directories exist] ***************************** 2025-09-27 01:00:09.583691 | orchestrator | Saturday 27 September 2025 00:59:15 +0000 (0:00:00.495) 0:00:01.462 **** 2025-09-27 01:00:09.583705 | orchestrator | changed: [testbed-node-1] => (item={'key': 'grafana', 'value': {'container_name': 'grafana', 'group': 'grafana', 'enabled': True, 'image': 'registry.osism.tech/kolla/grafana:2024.2', 'volumes': ['/etc/kolla/grafana/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'grafana_server': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '3000', 'listen_port': '3000'}, 'grafana_server_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '3000', 'listen_port': '3000'}}}}) 2025-09-27 01:00:09.583720 | orchestrator | changed: [testbed-node-2] => (item={'key': 'grafana', 'value': {'container_name': 'grafana', 'group': 'grafana', 'enabled': True, 'image': 'registry.osism.tech/kolla/grafana:2024.2', 'volumes': ['/etc/kolla/grafana/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'grafana_server': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '3000', 'listen_port': '3000'}, 'grafana_server_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '3000', 'listen_port': '3000'}}}}) 2025-09-27 01:00:09.583737 | orchestrator | changed: [testbed-node-0] => (item={'key': 'grafana', 'value': {'container_name': 'grafana', 'group': 'grafana', 'enabled': True, 'image': 'registry.osism.tech/kolla/grafana:2024.2', 'volumes': ['/etc/kolla/grafana/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'grafana_server': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '3000', 'listen_port': '3000'}, 'grafana_server_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '3000', 'listen_port': '3000'}}}}) 2025-09-27 01:00:09.583748 | orchestrator | 2025-09-27 01:00:09.583759 | orchestrator | TASK [grafana : Check if extra configuration file exists] ********************** 2025-09-27 01:00:09.583776 | orchestrator | Saturday 27 September 2025 00:59:15 +0000 (0:00:00.755) 0:00:02.217 **** 2025-09-27 01:00:09.583793 | orchestrator | [WARNING]: Skipped '/operations/prometheus/grafana' path due to this access 2025-09-27 01:00:09.583804 | orchestrator | issue: '/operations/prometheus/grafana' is not a directory 2025-09-27 01:00:09.583815 | orchestrator | ok: [testbed-node-0 -> localhost] 2025-09-27 01:00:09.583826 | orchestrator | 2025-09-27 01:00:09.583836 | orchestrator | TASK [grafana : include_tasks] ************************************************* 2025-09-27 01:00:09.583847 | 
orchestrator | Saturday 27 September 2025 00:59:16 +0000 (0:00:00.836) 0:00:03.053 **** 2025-09-27 01:00:09.583858 | orchestrator | included: /ansible/roles/grafana/tasks/copy-certs.yml for testbed-node-0, testbed-node-1, testbed-node-2 2025-09-27 01:00:09.583869 | orchestrator | 2025-09-27 01:00:09.583880 | orchestrator | TASK [service-cert-copy : grafana | Copying over extra CA certificates] ******** 2025-09-27 01:00:09.583890 | orchestrator | Saturday 27 September 2025 00:59:17 +0000 (0:00:00.633) 0:00:03.687 **** 2025-09-27 01:00:09.583902 | orchestrator | changed: [testbed-node-1] => (item={'key': 'grafana', 'value': {'container_name': 'grafana', 'group': 'grafana', 'enabled': True, 'image': 'registry.osism.tech/kolla/grafana:2024.2', 'volumes': ['/etc/kolla/grafana/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'grafana_server': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '3000', 'listen_port': '3000'}, 'grafana_server_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '3000', 'listen_port': '3000'}}}}) 2025-09-27 01:00:09.583914 | orchestrator | changed: [testbed-node-2] => (item={'key': 'grafana', 'value': {'container_name': 'grafana', 'group': 'grafana', 'enabled': True, 'image': 'registry.osism.tech/kolla/grafana:2024.2', 'volumes': ['/etc/kolla/grafana/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'grafana_server': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '3000', 'listen_port': '3000'}, 'grafana_server_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '3000', 'listen_port': '3000'}}}}) 2025-09-27 01:00:09.583926 | orchestrator | changed: [testbed-node-0] => (item={'key': 'grafana', 'value': {'container_name': 'grafana', 'group': 'grafana', 'enabled': True, 'image': 'registry.osism.tech/kolla/grafana:2024.2', 'volumes': ['/etc/kolla/grafana/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'grafana_server': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '3000', 'listen_port': '3000'}, 'grafana_server_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '3000', 'listen_port': '3000'}}}}) 2025-09-27 01:00:09.583937 | orchestrator | 2025-09-27 01:00:09.583948 | orchestrator | TASK [service-cert-copy : grafana | Copying over backend internal TLS certificate] *** 2025-09-27 01:00:09.583959 | orchestrator | Saturday 27 September 2025 00:59:18 +0000 (0:00:01.329) 0:00:05.017 **** 2025-09-27 01:00:09.583970 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'grafana', 'value': {'container_name': 'grafana', 'group': 'grafana', 'enabled': True, 'image': 'registry.osism.tech/kolla/grafana:2024.2', 'volumes': ['/etc/kolla/grafana/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'grafana_server': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '3000', 'listen_port': '3000'}, 'grafana_server_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 
'api.testbed.osism.xyz', 'port': '3000', 'listen_port': '3000'}}}})  2025-09-27 01:00:09.583999 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'grafana', 'value': {'container_name': 'grafana', 'group': 'grafana', 'enabled': True, 'image': 'registry.osism.tech/kolla/grafana:2024.2', 'volumes': ['/etc/kolla/grafana/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'grafana_server': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '3000', 'listen_port': '3000'}, 'grafana_server_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '3000', 'listen_port': '3000'}}}})  2025-09-27 01:00:09.584012 | orchestrator | skipping: [testbed-node-0] 2025-09-27 01:00:09.584044 | orchestrator | skipping: [testbed-node-1] 2025-09-27 01:00:09.584057 | orchestrator | skipping: [testbed-node-2] => (item={'key': 'grafana', 'value': {'container_name': 'grafana', 'group': 'grafana', 'enabled': True, 'image': 'registry.osism.tech/kolla/grafana:2024.2', 'volumes': ['/etc/kolla/grafana/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'grafana_server': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '3000', 'listen_port': '3000'}, 'grafana_server_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '3000', 'listen_port': '3000'}}}})  2025-09-27 01:00:09.584069 | orchestrator | skipping: [testbed-node-2] 2025-09-27 01:00:09.584079 | orchestrator | 2025-09-27 01:00:09.584090 | orchestrator | TASK [service-cert-copy : grafana | Copying over backend internal TLS key] ***** 2025-09-27 01:00:09.584101 | orchestrator | Saturday 27 September 2025 00:59:19 +0000 (0:00:00.355) 0:00:05.373 **** 2025-09-27 01:00:09.584118 | orchestrator | skipping: [testbed-node-0] => (item={'key': 'grafana', 'value': {'container_name': 'grafana', 'group': 'grafana', 'enabled': True, 'image': 'registry.osism.tech/kolla/grafana:2024.2', 'volumes': ['/etc/kolla/grafana/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'grafana_server': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '3000', 'listen_port': '3000'}, 'grafana_server_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '3000', 'listen_port': '3000'}}}})  2025-09-27 01:00:09.584138 | orchestrator | skipping: [testbed-node-0] 2025-09-27 01:00:09.584158 | orchestrator | skipping: [testbed-node-1] => (item={'key': 'grafana', 'value': {'container_name': 'grafana', 'group': 'grafana', 'enabled': True, 'image': 'registry.osism.tech/kolla/grafana:2024.2', 'volumes': ['/etc/kolla/grafana/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'grafana_server': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '3000', 'listen_port': '3000'}, 'grafana_server_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '3000', 'listen_port': '3000'}}}})  2025-09-27 01:00:09.584179 | orchestrator | skipping: [testbed-node-1] 2025-09-27 01:00:09.584200 | 
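The cert-copy results above follow the usual kolla-ansible pattern: extra CA certificates are always copied into the service's config directory, while the backend internal TLS certificate and key copies are guarded by a per-service backend-TLS switch (something along the lines of kolla_enable_tls_backend) and are skipped when it is off, as in this run. A minimal sketch of that decision, with assumed flag and file names that are not read from this log:

# Hedged sketch of the copy plan implied by the task results above.
# "backend_tls" and the cert/key file names are assumptions for illustration.
from pathlib import Path


def planned_cert_copies(service: str, backend_tls: bool) -> list[Path]:
    config_dir = Path("/etc/kolla") / service
    plan = [config_dir / "ca-certificates"]           # extra CA bundle(s): always copied
    if backend_tls:                                   # disabled in this run
        plan += [config_dir / f"{service}-cert.pem",  # assumed file names
                 config_dir / f"{service}-key.pem"]
    return plan


print(planned_cert_copies("grafana", backend_tls=False))
# Only the CA copy is planned, matching the "changed" result for the extra CA
# certificates and the "skipping" results for the backend TLS cert and key.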
orchestrator | skipping: [testbed-node-2] => (item={'key': 'grafana', 'value': {'container_name': 'grafana', 'group': 'grafana', 'enabled': True, 'image': 'registry.osism.tech/kolla/grafana:2024.2', 'volumes': ['/etc/kolla/grafana/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'grafana_server': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '3000', 'listen_port': '3000'}, 'grafana_server_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '3000', 'listen_port': '3000'}}}})  2025-09-27 01:00:09.584228 | orchestrator | skipping: [testbed-node-2] 2025-09-27 01:00:09.584240 | orchestrator | 2025-09-27 01:00:09.584251 | orchestrator | TASK [grafana : Copying over config.json files] ******************************** 2025-09-27 01:00:09.584262 | orchestrator | Saturday 27 September 2025 00:59:19 +0000 (0:00:00.957) 0:00:06.330 **** 2025-09-27 01:00:09.584278 | orchestrator | changed: [testbed-node-0] => (item={'key': 'grafana', 'value': {'container_name': 'grafana', 'group': 'grafana', 'enabled': True, 'image': 'registry.osism.tech/kolla/grafana:2024.2', 'volumes': ['/etc/kolla/grafana/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'grafana_server': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '3000', 'listen_port': '3000'}, 'grafana_server_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '3000', 'listen_port': '3000'}}}}) 2025-09-27 01:00:09.584299 | orchestrator | changed: [testbed-node-1] => (item={'key': 'grafana', 'value': {'container_name': 'grafana', 'group': 'grafana', 'enabled': True, 'image': 'registry.osism.tech/kolla/grafana:2024.2', 'volumes': ['/etc/kolla/grafana/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'grafana_server': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '3000', 'listen_port': '3000'}, 'grafana_server_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '3000', 'listen_port': '3000'}}}}) 2025-09-27 01:00:09.584311 | orchestrator | changed: [testbed-node-2] => (item={'key': 'grafana', 'value': {'container_name': 'grafana', 'group': 'grafana', 'enabled': True, 'image': 'registry.osism.tech/kolla/grafana:2024.2', 'volumes': ['/etc/kolla/grafana/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'grafana_server': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '3000', 'listen_port': '3000'}, 'grafana_server_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '3000', 'listen_port': '3000'}}}}) 2025-09-27 01:00:09.584322 | orchestrator | 2025-09-27 01:00:09.584333 | orchestrator | TASK [grafana : Copying over grafana.ini] ************************************** 2025-09-27 01:00:09.584344 | orchestrator | Saturday 27 September 2025 00:59:21 +0000 (0:00:01.324) 0:00:07.654 **** 2025-09-27 01:00:09.584355 | orchestrator | changed: [testbed-node-0] => (item={'key': 'grafana', 'value': {'container_name': 
'grafana', 'group': 'grafana', 'enabled': True, 'image': 'registry.osism.tech/kolla/grafana:2024.2', 'volumes': ['/etc/kolla/grafana/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'grafana_server': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '3000', 'listen_port': '3000'}, 'grafana_server_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '3000', 'listen_port': '3000'}}}}) 2025-09-27 01:00:09.584367 | orchestrator | changed: [testbed-node-1] => (item={'key': 'grafana', 'value': {'container_name': 'grafana', 'group': 'grafana', 'enabled': True, 'image': 'registry.osism.tech/kolla/grafana:2024.2', 'volumes': ['/etc/kolla/grafana/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'grafana_server': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '3000', 'listen_port': '3000'}, 'grafana_server_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '3000', 'listen_port': '3000'}}}}) 2025-09-27 01:00:09.584384 | orchestrator | changed: [testbed-node-2] => (item={'key': 'grafana', 'value': {'container_name': 'grafana', 'group': 'grafana', 'enabled': True, 'image': 'registry.osism.tech/kolla/grafana:2024.2', 'volumes': ['/etc/kolla/grafana/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'grafana_server': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '3000', 'listen_port': '3000'}, 'grafana_server_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '3000', 'listen_port': '3000'}}}}) 2025-09-27 01:00:09.584396 | orchestrator | 2025-09-27 01:00:09.584407 | orchestrator | TASK [grafana : Copying over extra configuration file] ************************* 2025-09-27 01:00:09.584418 | orchestrator | Saturday 27 September 2025 00:59:22 +0000 (0:00:01.327) 0:00:08.981 **** 2025-09-27 01:00:09.584429 | orchestrator | skipping: [testbed-node-0] 2025-09-27 01:00:09.584440 | orchestrator | skipping: [testbed-node-1] 2025-09-27 01:00:09.584451 | orchestrator | skipping: [testbed-node-2] 2025-09-27 01:00:09.584462 | orchestrator | 2025-09-27 01:00:09.584473 | orchestrator | TASK [grafana : Configuring Prometheus as data source for Grafana] ************* 2025-09-27 01:00:09.584483 | orchestrator | Saturday 27 September 2025 00:59:23 +0000 (0:00:00.504) 0:00:09.485 **** 2025-09-27 01:00:09.584499 | orchestrator | changed: [testbed-node-0] => (item=/ansible/roles/grafana/templates/prometheus.yaml.j2) 2025-09-27 01:00:09.584510 | orchestrator | changed: [testbed-node-1] => (item=/ansible/roles/grafana/templates/prometheus.yaml.j2) 2025-09-27 01:00:09.584521 | orchestrator | changed: [testbed-node-2] => (item=/ansible/roles/grafana/templates/prometheus.yaml.j2) 2025-09-27 01:00:09.584532 | orchestrator | 2025-09-27 01:00:09.584547 | orchestrator | TASK [grafana : Configuring dashboards provisioning] *************************** 2025-09-27 01:00:09.584558 | orchestrator | Saturday 27 September 2025 00:59:24 +0000 (0:00:01.359) 0:00:10.845 **** 2025-09-27 01:00:09.584569 | orchestrator | changed: [testbed-node-0] => 
(item=/opt/configuration/environments/kolla/files/overlays/grafana/provisioning.yaml) 2025-09-27 01:00:09.584580 | orchestrator | changed: [testbed-node-1] => (item=/opt/configuration/environments/kolla/files/overlays/grafana/provisioning.yaml) 2025-09-27 01:00:09.584591 | orchestrator | changed: [testbed-node-2] => (item=/opt/configuration/environments/kolla/files/overlays/grafana/provisioning.yaml) 2025-09-27 01:00:09.584602 | orchestrator | 2025-09-27 01:00:09.584613 | orchestrator | TASK [grafana : Find custom grafana dashboards] ******************************** 2025-09-27 01:00:09.584624 | orchestrator | Saturday 27 September 2025 00:59:25 +0000 (0:00:01.355) 0:00:12.200 **** 2025-09-27 01:00:09.584635 | orchestrator | ok: [testbed-node-0 -> localhost] 2025-09-27 01:00:09.584645 | orchestrator | 2025-09-27 01:00:09.584656 | orchestrator | TASK [grafana : Find templated grafana dashboards] ***************************** 2025-09-27 01:00:09.584667 | orchestrator | Saturday 27 September 2025 00:59:26 +0000 (0:00:00.715) 0:00:12.916 **** 2025-09-27 01:00:09.584678 | orchestrator | [WARNING]: Skipped '/etc/kolla/grafana/dashboards' path due to this access 2025-09-27 01:00:09.584689 | orchestrator | issue: '/etc/kolla/grafana/dashboards' is not a directory 2025-09-27 01:00:09.584700 | orchestrator | ok: [testbed-node-0] 2025-09-27 01:00:09.584710 | orchestrator | ok: [testbed-node-1] 2025-09-27 01:00:09.584721 | orchestrator | ok: [testbed-node-2] 2025-09-27 01:00:09.584732 | orchestrator | 2025-09-27 01:00:09.584743 | orchestrator | TASK [grafana : Prune templated Grafana dashboards] **************************** 2025-09-27 01:00:09.584754 | orchestrator | Saturday 27 September 2025 00:59:27 +0000 (0:00:00.782) 0:00:13.698 **** 2025-09-27 01:00:09.584764 | orchestrator | skipping: [testbed-node-0] 2025-09-27 01:00:09.584775 | orchestrator | skipping: [testbed-node-1] 2025-09-27 01:00:09.584786 | orchestrator | skipping: [testbed-node-2] 2025-09-27 01:00:09.584797 | orchestrator | 2025-09-27 01:00:09.584808 | orchestrator | TASK [grafana : Copying over custom dashboards] ******************************** 2025-09-27 01:00:09.584818 | orchestrator | Saturday 27 September 2025 00:59:27 +0000 (0:00:00.493) 0:00:14.192 **** 2025-09-27 01:00:09.584835 | orchestrator | changed: [testbed-node-0] => (item={'key': 'ceph/ceph-cluster-advanced.json', 'value': {'path': '/operations/grafana/dashboards/ceph/ceph-cluster-advanced.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 117836, 'inode': 1090200, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.4532087, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-27 01:00:09.584848 | orchestrator | changed: [testbed-node-1] => (item={'key': 'ceph/ceph-cluster-advanced.json', 'value': {'path': '/operations/grafana/dashboards/ceph/ceph-cluster-advanced.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 117836, 'inode': 1090200, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.4532087, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': 
False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-27 01:00:09.584859 | orchestrator | changed: [testbed-node-2] => (item={'key': 'ceph/ceph-cluster-advanced.json', 'value': {'path': '/operations/grafana/dashboards/ceph/ceph-cluster-advanced.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 117836, 'inode': 1090200, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.4532087, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-27 01:00:09.584886 | orchestrator | changed: [testbed-node-0] => (item={'key': 'ceph/rbd-overview.json', 'value': {'path': '/operations/grafana/dashboards/ceph/rbd-overview.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 25686, 'inode': 1090250, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.4669797, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-27 01:00:09.584899 | orchestrator | changed: [testbed-node-1] => (item={'key': 'ceph/rbd-overview.json', 'value': {'path': '/operations/grafana/dashboards/ceph/rbd-overview.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 25686, 'inode': 1090250, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.4669797, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-27 01:00:09.584910 | orchestrator | changed: [testbed-node-2] => (item={'key': 'ceph/rbd-overview.json', 'value': {'path': '/operations/grafana/dashboards/ceph/rbd-overview.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 25686, 'inode': 1090250, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.4669797, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-27 01:00:09.584927 | orchestrator | changed: [testbed-node-1] => (item={'key': 'ceph/ceph_pools.json', 'value': {'path': '/operations/grafana/dashboards/ceph/ceph_pools.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 25279, 'inode': 1090212, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.4554462, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-27 01:00:09.584939 | orchestrator | changed: [testbed-node-0] => (item={'key': 
'ceph/ceph_pools.json', 'value': {'path': '/operations/grafana/dashboards/ceph/ceph_pools.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 25279, 'inode': 1090212, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.4554462, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-27 01:00:09.584950 | orchestrator | changed: [testbed-node-2] => (item={'key': 'ceph/ceph_pools.json', 'value': {'path': '/operations/grafana/dashboards/ceph/ceph_pools.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 25279, 'inode': 1090212, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.4554462, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-27 01:00:09.584966 | orchestrator | changed: [testbed-node-0] => (item={'key': 'ceph/rgw-s3-analytics.json', 'value': {'path': '/operations/grafana/dashboards/ceph/rgw-s3-analytics.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 167897, 'inode': 1090252, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.46945, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-27 01:00:09.584984 | orchestrator | changed: [testbed-node-1] => (item={'key': 'ceph/rgw-s3-analytics.json', 'value': {'path': '/operations/grafana/dashboards/ceph/rgw-s3-analytics.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 167897, 'inode': 1090252, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.46945, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-27 01:00:09.584995 | orchestrator | changed: [testbed-node-2] => (item={'key': 'ceph/rgw-s3-analytics.json', 'value': {'path': '/operations/grafana/dashboards/ceph/rgw-s3-analytics.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 167897, 'inode': 1090252, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.46945, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-27 01:00:09.585007 | orchestrator | changed: [testbed-node-0] => (item={'key': 'ceph/osd-device-details.json', 'value': {'path': '/operations/grafana/dashboards/ceph/osd-device-details.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 
'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 26655, 'inode': 1090225, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.459104, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-27 01:00:09.585060 | orchestrator | changed: [testbed-node-1] => (item={'key': 'ceph/osd-device-details.json', 'value': {'path': '/operations/grafana/dashboards/ceph/osd-device-details.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 26655, 'inode': 1090225, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.459104, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-27 01:00:09.585074 | orchestrator | changed: [testbed-node-2] => (item={'key': 'ceph/osd-device-details.json', 'value': {'path': '/operations/grafana/dashboards/ceph/osd-device-details.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 26655, 'inode': 1090225, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.459104, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-27 01:00:09.585086 | orchestrator | changed: [testbed-node-0] => (item={'key': 'ceph/radosgw-overview.json', 'value': {'path': '/operations/grafana/dashboards/ceph/radosgw-overview.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 39556, 'inode': 1090244, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.4653425, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-27 01:00:09.585655 | orchestrator | changed: [testbed-node-1] => (item={'key': 'ceph/radosgw-overview.json', 'value': {'path': '/operations/grafana/dashboards/ceph/radosgw-overview.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 39556, 'inode': 1090244, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.4653425, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-27 01:00:09.585682 | orchestrator | changed: [testbed-node-2] => (item={'key': 'ceph/radosgw-overview.json', 'value': {'path': '/operations/grafana/dashboards/ceph/radosgw-overview.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 39556, 'inode': 1090244, 'dev': 112, 'nlink': 1, 
'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.4653425, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-27 01:00:09.585694 | orchestrator | changed: [testbed-node-0] => (item={'key': 'ceph/README.md', 'value': {'path': '/operations/grafana/dashboards/ceph/README.md', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 84, 'inode': 1090198, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.4513922, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-27 01:00:09.585716 | orchestrator | changed: [testbed-node-1] => (item={'key': 'ceph/README.md', 'value': {'path': '/operations/grafana/dashboards/ceph/README.md', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 84, 'inode': 1090198, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.4513922, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-27 01:00:09.585727 | orchestrator | changed: [testbed-node-2] => (item={'key': 'ceph/README.md', 'value': {'path': '/operations/grafana/dashboards/ceph/README.md', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 84, 'inode': 1090198, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.4513922, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-27 01:00:09.585739 | orchestrator | changed: [testbed-node-0] => (item={'key': 'ceph/ceph-cluster.json', 'value': {'path': '/operations/grafana/dashboards/ceph/ceph-cluster.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 34113, 'inode': 1090206, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.4532087, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-27 01:00:09.585754 | orchestrator | changed: [testbed-node-1] => (item={'key': 'ceph/ceph-cluster.json', 'value': {'path': '/operations/grafana/dashboards/ceph/ceph-cluster.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 34113, 'inode': 1090206, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.4532087, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 
'isuid': False, 'isgid': False}}) 2025-09-27 01:00:09.585774 | orchestrator | changed: [testbed-node-2] => (item={'key': 'ceph/ceph-cluster.json', 'value': {'path': '/operations/grafana/dashboards/ceph/ceph-cluster.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 34113, 'inode': 1090206, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.4532087, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-27 01:00:09.585786 | orchestrator | changed: [testbed-node-1] => (item={'key': 'ceph/cephfs-overview.json', 'value': {'path': '/operations/grafana/dashboards/ceph/cephfs-overview.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 9025, 'inode': 1090215, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.4554462, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-27 01:00:09.585803 | orchestrator | changed: [testbed-node-0] => (item={'key': 'ceph/cephfs-overview.json', 'value': {'path': '/operations/grafana/dashboards/ceph/cephfs-overview.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 9025, 'inode': 1090215, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.4554462, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-27 01:00:09.585815 | orchestrator | changed: [testbed-node-2] => (item={'key': 'ceph/cephfs-overview.json', 'value': {'path': '/operations/grafana/dashboards/ceph/cephfs-overview.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 9025, 'inode': 1090215, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.4554462, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-27 01:00:09.585826 | orchestrator | changed: [testbed-node-1] => (item={'key': 'ceph/pool-detail.json', 'value': {'path': '/operations/grafana/dashboards/ceph/pool-detail.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 19609, 'inode': 1090237, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.4613044, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-27 01:00:09.585837 | orchestrator | changed: [testbed-node-0] => (item={'key': 'ceph/pool-detail.json', 'value': {'path': 
'/operations/grafana/dashboards/ceph/pool-detail.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 19609, 'inode': 1090237, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.4613044, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-27 01:00:09.585860 | orchestrator | changed: [testbed-node-2] => (item={'key': 'ceph/pool-detail.json', 'value': {'path': '/operations/grafana/dashboards/ceph/pool-detail.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 19609, 'inode': 1090237, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.4613044, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-27 01:00:09.585872 | orchestrator | changed: [testbed-node-1] => (item={'key': 'ceph/rbd-details.json', 'value': {'path': '/operations/grafana/dashboards/ceph/rbd-details.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 12997, 'inode': 1090249, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.4665055, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-27 01:00:09.585890 | orchestrator | changed: [testbed-node-0] => (item={'key': 'ceph/rbd-details.json', 'value': {'path': '/operations/grafana/dashboards/ceph/rbd-details.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 12997, 'inode': 1090249, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.4665055, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-27 01:00:09.585902 | orchestrator | changed: [testbed-node-2] => (item={'key': 'ceph/rbd-details.json', 'value': {'path': '/operations/grafana/dashboards/ceph/rbd-details.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 12997, 'inode': 1090249, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.4665055, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-27 01:00:09.585913 | orchestrator | changed: [testbed-node-0] => (item={'key': 'ceph/ceph_overview.json', 'value': {'path': '/operations/grafana/dashboards/ceph/ceph_overview.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 
'gid': 0, 'size': 80386, 'inode': 1090208, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.4540625, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-27 01:00:09.585924 | orchestrator | changed: [testbed-node-1] => (item={'key': 'ceph/ceph_overview.json', 'value': {'path': '/operations/grafana/dashboards/ceph/ceph_overview.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 80386, 'inode': 1090208, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.4540625, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-27 01:00:09.585945 | orchestrator | changed: [testbed-node-2] => (item={'key': 'ceph/ceph_overview.json', 'value': {'path': '/operations/grafana/dashboards/ceph/ceph_overview.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 80386, 'inode': 1090208, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.4540625, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-27 01:00:09.585957 | orchestrator | changed: [testbed-node-0] => (item={'key': 'ceph/radosgw-detail.json', 'value': {'path': '/operations/grafana/dashboards/ceph/radosgw-detail.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 19695, 'inode': 1090242, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.4641697, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-27 01:00:09.585974 | orchestrator | changed: [testbed-node-1] => (item={'key': 'ceph/radosgw-detail.json', 'value': {'path': '/operations/grafana/dashboards/ceph/radosgw-detail.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 19695, 'inode': 1090242, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.4641697, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-27 01:00:09.585986 | orchestrator | changed: [testbed-node-2] => (item={'key': 'ceph/radosgw-detail.json', 'value': {'path': '/operations/grafana/dashboards/ceph/radosgw-detail.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 19695, 'inode': 1090242, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.4641697, 'gr_name': 'root', 'pw_name': 
'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-27 01:00:09.585997 | orchestrator | changed: [testbed-node-0] => (item={'key': 'ceph/osds-overview.json', 'value': {'path': '/operations/grafana/dashboards/ceph/osds-overview.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 38432, 'inode': 1090229, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.4601247, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-27 01:00:09.586009 | orchestrator | changed: [testbed-node-1] => (item={'key': 'ceph/osds-overview.json', 'value': {'path': '/operations/grafana/dashboards/ceph/osds-overview.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 38432, 'inode': 1090229, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.4601247, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-27 01:00:09.586088 | orchestrator | changed: [testbed-node-2] => (item={'key': 'ceph/osds-overview.json', 'value': {'path': '/operations/grafana/dashboards/ceph/osds-overview.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 38432, 'inode': 1090229, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.4601247, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-27 01:00:09.586109 | orchestrator | changed: [testbed-node-0] => (item={'key': 'ceph/multi-cluster-overview.json', 'value': {'path': '/operations/grafana/dashboards/ceph/multi-cluster-overview.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 62676, 'inode': 1090221, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.4583533, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-27 01:00:09.586128 | orchestrator | changed: [testbed-node-1] => (item={'key': 'ceph/multi-cluster-overview.json', 'value': {'path': '/operations/grafana/dashboards/ceph/multi-cluster-overview.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 62676, 'inode': 1090221, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.4583533, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': 
False, 'isgid': False}}) 2025-09-27 01:00:09.586139 | orchestrator | changed: [testbed-node-2] => (item={'key': 'ceph/multi-cluster-overview.json', 'value': {'path': '/operations/grafana/dashboards/ceph/multi-cluster-overview.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 62676, 'inode': 1090221, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.4583533, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-27 01:00:09.586151 | orchestrator | changed: [testbed-node-1] => (item={'key': 'ceph/hosts-overview.json', 'value': {'path': '/operations/grafana/dashboards/ceph/hosts-overview.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 27218, 'inode': 1090219, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.4562101, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-27 01:00:09.586163 | orchestrator | changed: [testbed-node-0] => (item={'key': 'ceph/hosts-overview.json', 'value': {'path': '/operations/grafana/dashboards/ceph/hosts-overview.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 27218, 'inode': 1090219, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.4562101, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-27 01:00:09.586174 | orchestrator | changed: [testbed-node-2] => (item={'key': 'ceph/hosts-overview.json', 'value': {'path': '/operations/grafana/dashboards/ceph/hosts-overview.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 27218, 'inode': 1090219, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.4562101, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-27 01:00:09.586207 | orchestrator | changed: [testbed-node-1] => (item={'key': 'ceph/pool-overview.json', 'value': {'path': '/operations/grafana/dashboards/ceph/pool-overview.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 49139, 'inode': 1090238, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.463111, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-27 01:00:09.586246 | orchestrator | changed: [testbed-node-0] => (item={'key': 'ceph/pool-overview.json', 'value': {'path': 
'/operations/grafana/dashboards/ceph/pool-overview.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 49139, 'inode': 1090238, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.463111, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-27 01:00:09.586269 | orchestrator | changed: [testbed-node-2] => (item={'key': 'ceph/pool-overview.json', 'value': {'path': '/operations/grafana/dashboards/ceph/pool-overview.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 49139, 'inode': 1090238, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.463111, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-27 01:00:09.586293 | orchestrator | changed: [testbed-node-1] => (item={'key': 'ceph/host-details.json', 'value': {'path': '/operations/grafana/dashboards/ceph/host-details.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 44791, 'inode': 1090216, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.4561553, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-27 01:00:09.586313 | orchestrator | changed: [testbed-node-0] => (item={'key': 'ceph/host-details.json', 'value': {'path': '/operations/grafana/dashboards/ceph/host-details.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 44791, 'inode': 1090216, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.4561553, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-27 01:00:09.586327 | orchestrator | changed: [testbed-node-2] => (item={'key': 'ceph/host-details.json', 'value': {'path': '/operations/grafana/dashboards/ceph/host-details.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 44791, 'inode': 1090216, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.4561553, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-27 01:00:09.586352 | orchestrator | changed: [testbed-node-1] => (item={'key': 'ceph/radosgw-sync-overview.json', 'value': {'path': '/operations/grafana/dashboards/ceph/radosgw-sync-overview.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 
'issock': False, 'uid': 0, 'gid': 0, 'size': 16156, 'inode': 1090247, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.4653425, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-27 01:00:09.586373 | orchestrator | changed: [testbed-node-0] => (item={'key': 'ceph/radosgw-sync-overview.json', 'value': {'path': '/operations/grafana/dashboards/ceph/radosgw-sync-overview.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 16156, 'inode': 1090247, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.4653425, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-27 01:00:09.586387 | orchestrator | changed: [testbed-node-2] => (item={'key': 'ceph/radosgw-sync-overview.json', 'value': {'path': '/operations/grafana/dashboards/ceph/radosgw-sync-overview.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 16156, 'inode': 1090247, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.4653425, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-27 01:00:09.586399 | orchestrator | changed: [testbed-node-1] => (item={'key': 'openstack/openstack.json', 'value': {'path': '/operations/grafana/dashboards/openstack/openstack.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 57270, 'inode': 1090373, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.510211, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-27 01:00:09.586413 | orchestrator | changed: [testbed-node-0] => (item={'key': 'openstack/openstack.json', 'value': {'path': '/operations/grafana/dashboards/openstack/openstack.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 57270, 'inode': 1090373, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.510211, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-27 01:00:09.586426 | orchestrator | changed: [testbed-node-2] => (item={'key': 'openstack/openstack.json', 'value': {'path': '/operations/grafana/dashboards/openstack/openstack.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 57270, 'inode': 1090373, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 
1758932198.510211, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-27 01:00:09.586443 | orchestrator | changed: [testbed-node-0] => (item={'key': 'infrastructure/haproxy.json', 'value': {'path': '/operations/grafana/dashboards/infrastructure/haproxy.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 410814, 'inode': 1090280, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.4859092, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-27 01:00:09.586469 | orchestrator | changed: [testbed-node-1] => (item={'key': 'infrastructure/haproxy.json', 'value': {'path': '/operations/grafana/dashboards/infrastructure/haproxy.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 410814, 'inode': 1090280, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.4859092, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-27 01:00:09.586482 | orchestrator | changed: [testbed-node-2] => (item={'key': 'infrastructure/haproxy.json', 'value': {'path': '/operations/grafana/dashboards/infrastructure/haproxy.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 410814, 'inode': 1090280, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.4859092, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-27 01:00:09.586495 | orchestrator | changed: [testbed-node-0] => (item={'key': 'infrastructure/database.json', 'value': {'path': '/operations/grafana/dashboards/infrastructure/database.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 30898, 'inode': 1090265, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.4758847, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-27 01:00:09.586509 | orchestrator | changed: [testbed-node-1] => (item={'key': 'infrastructure/database.json', 'value': {'path': '/operations/grafana/dashboards/infrastructure/database.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 30898, 'inode': 1090265, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.4758847, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': 
False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-27 01:00:09.586522 | orchestrator | changed: [testbed-node-2] => (item={'key': 'infrastructure/database.json', 'value': {'path': '/operations/grafana/dashboards/infrastructure/database.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 30898, 'inode': 1090265, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.4758847, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-27 01:00:09.586535 | orchestrator | changed: [testbed-node-0] => (item={'key': 'infrastructure/node-rsrc-use.json', 'value': {'path': '/operations/grafana/dashboards/infrastructure/node-rsrc-use.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 15725, 'inode': 1090309, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.489365, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-27 01:00:09.586565 | orchestrator | changed: [testbed-node-1] => (item={'key': 'infrastructure/node-rsrc-use.json', 'value': {'path': '/operations/grafana/dashboards/infrastructure/node-rsrc-use.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 15725, 'inode': 1090309, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.489365, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-27 01:00:09.586580 | orchestrator | changed: [testbed-node-2] => (item={'key': 'infrastructure/node-rsrc-use.json', 'value': {'path': '/operations/grafana/dashboards/infrastructure/node-rsrc-use.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 15725, 'inode': 1090309, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.489365, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-27 01:00:09.586594 | orchestrator | changed: [testbed-node-0] => (item={'key': 'infrastructure/alertmanager-overview.json', 'value': {'path': '/operations/grafana/dashboards/infrastructure/alertmanager-overview.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 9645, 'inode': 1090257, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.4702106, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': 
False}}) 2025-09-27 01:00:09.586605 | orchestrator | changed: [testbed-node-1] => (item={'key': 'infrastructure/alertmanager-overview.json', 'value': {'path': '/operations/grafana/dashboards/infrastructure/alertmanager-overview.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 9645, 'inode': 1090257, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.4702106, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-27 01:00:09.586617 | orchestrator | changed: [testbed-node-2] => (item={'key': 'infrastructure/alertmanager-overview.json', 'value': {'path': '/operations/grafana/dashboards/infrastructure/alertmanager-overview.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 9645, 'inode': 1090257, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.4702106, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-27 01:00:09.586628 | orchestrator | changed: [testbed-node-0] => (item={'key': 'infrastructure/opensearch.json', 'value': {'path': '/operations/grafana/dashboards/infrastructure/opensearch.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 65458, 'inode': 1090358, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.5019805, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-27 01:00:09.586656 | orchestrator | changed: [testbed-node-1] => (item={'key': 'infrastructure/opensearch.json', 'value': {'path': '/operations/grafana/dashboards/infrastructure/opensearch.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 65458, 'inode': 1090358, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.5019805, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-27 01:00:09.586668 | orchestrator | changed: [testbed-node-2] => (item={'key': 'infrastructure/opensearch.json', 'value': {'path': '/operations/grafana/dashboards/infrastructure/opensearch.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 65458, 'inode': 1090358, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.5019805, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-27 01:00:09.586680 | orchestrator | changed: 
[testbed-node-0] => (item={'key': 'infrastructure/node_exporter_full.json', 'value': {'path': '/operations/grafana/dashboards/infrastructure/node_exporter_full.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 682774, 'inode': 1090312, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.4969218, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-27 01:00:09.586691 | orchestrator | changed: [testbed-node-1] => (item={'key': 'infrastructure/node_exporter_full.json', 'value': {'path': '/operations/grafana/dashboards/infrastructure/node_exporter_full.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 682774, 'inode': 1090312, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.4969218, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-27 01:00:09.586703 | orchestrator | changed: [testbed-node-2] => (item={'key': 'infrastructure/node_exporter_full.json', 'value': {'path': '/operations/grafana/dashboards/infrastructure/node_exporter_full.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 682774, 'inode': 1090312, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.4969218, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-27 01:00:09.586714 | orchestrator | changed: [testbed-node-0] => (item={'key': 'infrastructure/prometheus-remote-write.json', 'value': {'path': '/operations/grafana/dashboards/infrastructure/prometheus-remote-write.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 22317, 'inode': 1090361, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.5019805, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-27 01:00:09.586741 | orchestrator | changed: [testbed-node-1] => (item={'key': 'infrastructure/prometheus-remote-write.json', 'value': {'path': '/operations/grafana/dashboards/infrastructure/prometheus-remote-write.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 22317, 'inode': 1090361, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.5019805, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-27 01:00:09.586753 | orchestrator | changed: 
[testbed-node-2] => (item={'key': 'infrastructure/prometheus-remote-write.json', 'value': {'path': '/operations/grafana/dashboards/infrastructure/prometheus-remote-write.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 22317, 'inode': 1090361, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.5019805, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-27 01:00:09.586764 | orchestrator | changed: [testbed-node-0] => (item={'key': 'infrastructure/redfish.json', 'value': {'path': '/operations/grafana/dashboards/infrastructure/redfish.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 38087, 'inode': 1090371, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.507211, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-27 01:00:09.586775 | orchestrator | changed: [testbed-node-1] => (item={'key': 'infrastructure/redfish.json', 'value': {'path': '/operations/grafana/dashboards/infrastructure/redfish.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 38087, 'inode': 1090371, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.507211, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-27 01:00:09.586787 | orchestrator | changed: [testbed-node-2] => (item={'key': 'infrastructure/redfish.json', 'value': {'path': '/operations/grafana/dashboards/infrastructure/redfish.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 38087, 'inode': 1090371, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.507211, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-27 01:00:09.586798 | orchestrator | changed: [testbed-node-0] => (item={'key': 'infrastructure/nodes.json', 'value': {'path': '/operations/grafana/dashboards/infrastructure/nodes.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 21109, 'inode': 1090350, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.498581, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-27 01:00:09.586820 | orchestrator | changed: [testbed-node-1] => (item={'key': 'infrastructure/nodes.json', 'value': {'path': 
'/operations/grafana/dashboards/infrastructure/nodes.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 21109, 'inode': 1090350, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.498581, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-27 01:00:09.586838 | orchestrator | changed: [testbed-node-2] => (item={'key': 'infrastructure/nodes.json', 'value': {'path': '/operations/grafana/dashboards/infrastructure/nodes.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 21109, 'inode': 1090350, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.498581, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-27 01:00:09.586849 | orchestrator | changed: [testbed-node-0] => (item={'key': 'infrastructure/memcached.json', 'value': {'path': '/operations/grafana/dashboards/infrastructure/memcached.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 24243, 'inode': 1090301, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.4884803, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-27 01:00:09.586861 | orchestrator | changed: [testbed-node-1] => (item={'key': 'infrastructure/memcached.json', 'value': {'path': '/operations/grafana/dashboards/infrastructure/memcached.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 24243, 'inode': 1090301, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.4884803, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-27 01:00:09.586872 | orchestrator | changed: [testbed-node-2] => (item={'key': 'infrastructure/memcached.json', 'value': {'path': '/operations/grafana/dashboards/infrastructure/memcached.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 24243, 'inode': 1090301, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.4884803, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-27 01:00:09.586884 | orchestrator | changed: [testbed-node-0] => (item={'key': 'infrastructure/fluentd.json', 'value': {'path': '/operations/grafana/dashboards/infrastructure/fluentd.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 
'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 82960, 'inode': 1090276, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.4819665, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-27 01:00:09.586905 | orchestrator | changed: [testbed-node-1] => (item={'key': 'infrastructure/fluentd.json', 'value': {'path': '/operations/grafana/dashboards/infrastructure/fluentd.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 82960, 'inode': 1090276, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.4819665, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-27 01:00:09.586923 | orchestrator | changed: [testbed-node-2] => (item={'key': 'infrastructure/fluentd.json', 'value': {'path': '/operations/grafana/dashboards/infrastructure/fluentd.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 82960, 'inode': 1090276, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.4819665, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-27 01:00:09.586935 | orchestrator | changed: [testbed-node-0] => (item={'key': 'infrastructure/libvirt.json', 'value': {'path': '/operations/grafana/dashboards/infrastructure/libvirt.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 29672, 'inode': 1090294, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.4866474, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-27 01:00:09.586946 | orchestrator | changed: [testbed-node-1] => (item={'key': 'infrastructure/libvirt.json', 'value': {'path': '/operations/grafana/dashboards/infrastructure/libvirt.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 29672, 'inode': 1090294, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.4866474, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-27 01:00:09.586958 | orchestrator | changed: [testbed-node-2] => (item={'key': 'infrastructure/libvirt.json', 'value': {'path': '/operations/grafana/dashboards/infrastructure/libvirt.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 29672, 'inode': 1090294, 'dev': 112, 'nlink': 1, 'atime': 
1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.4866474, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-27 01:00:09.586969 | orchestrator | changed: [testbed-node-0] => (item={'key': 'infrastructure/elasticsearch.json', 'value': {'path': '/operations/grafana/dashboards/infrastructure/elasticsearch.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 187864, 'inode': 1090268, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.4799588, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-27 01:00:09.586989 | orchestrator | changed: [testbed-node-1] => (item={'key': 'infrastructure/elasticsearch.json', 'value': {'path': '/operations/grafana/dashboards/infrastructure/elasticsearch.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 187864, 'inode': 1090268, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.4799588, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-27 01:00:09.587010 | orchestrator | changed: [testbed-node-2] => (item={'key': 'infrastructure/elasticsearch.json', 'value': {'path': '/operations/grafana/dashboards/infrastructure/elasticsearch.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 187864, 'inode': 1090268, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.4799588, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-27 01:00:09.587044 | orchestrator | changed: [testbed-node-0] => (item={'key': 'infrastructure/node-cluster-rsrc-use.json', 'value': {'path': '/operations/grafana/dashboards/infrastructure/node-cluster-rsrc-use.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 16098, 'inode': 1090306, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.488969, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-27 01:00:09.587060 | orchestrator | changed: [testbed-node-1] => (item={'key': 'infrastructure/node-cluster-rsrc-use.json', 'value': {'path': '/operations/grafana/dashboards/infrastructure/node-cluster-rsrc-use.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 16098, 'inode': 1090306, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 
1758932198.488969, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-27 01:00:09.587071 | orchestrator | changed: [testbed-node-2] => (item={'key': 'infrastructure/node-cluster-rsrc-use.json', 'value': {'path': '/operations/grafana/dashboards/infrastructure/node-cluster-rsrc-use.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 16098, 'inode': 1090306, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.488969, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-27 01:00:09.587083 | orchestrator | changed: [testbed-node-0] => (item={'key': 'infrastructure/rabbitmq.json', 'value': {'path': '/operations/grafana/dashboards/infrastructure/rabbitmq.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 222049, 'inode': 1090367, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.5067127, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-27 01:00:09.587101 | orchestrator | changed: [testbed-node-1] => (item={'key': 'infrastructure/rabbitmq.json', 'value': {'path': '/operations/grafana/dashboards/infrastructure/rabbitmq.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 222049, 'inode': 1090367, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.5067127, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-27 01:00:09.587123 | orchestrator | changed: [testbed-node-2] => (item={'key': 'infrastructure/rabbitmq.json', 'value': {'path': '/operations/grafana/dashboards/infrastructure/rabbitmq.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 222049, 'inode': 1090367, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.5067127, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-27 01:00:09.587135 | orchestrator | changed: [testbed-node-0] => (item={'key': 'infrastructure/prometheus_alertmanager.json', 'value': {'path': '/operations/grafana/dashboards/infrastructure/prometheus_alertmanager.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 115472, 'inode': 1090365, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.503282, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 
'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-27 01:00:09.587146 | orchestrator | changed: [testbed-node-1] => (item={'key': 'infrastructure/prometheus_alertmanager.json', 'value': {'path': '/operations/grafana/dashboards/infrastructure/prometheus_alertmanager.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 115472, 'inode': 1090365, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.503282, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-27 01:00:09.587157 | orchestrator | changed: [testbed-node-2] => (item={'key': 'infrastructure/prometheus_alertmanager.json', 'value': {'path': '/operations/grafana/dashboards/infrastructure/prometheus_alertmanager.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 115472, 'inode': 1090365, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.503282, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-27 01:00:09.587169 | orchestrator | changed: [testbed-node-0] => (item={'key': 'infrastructure/blackbox.json', 'value': {'path': '/operations/grafana/dashboards/infrastructure/blackbox.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 31128, 'inode': 1090258, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.4728014, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-27 01:00:09.587187 | orchestrator | changed: [testbed-node-1] => (item={'key': 'infrastructure/blackbox.json', 'value': {'path': '/operations/grafana/dashboards/infrastructure/blackbox.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 31128, 'inode': 1090258, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.4728014, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-27 01:00:09.587208 | orchestrator | changed: [testbed-node-2] => (item={'key': 'infrastructure/blackbox.json', 'value': {'path': '/operations/grafana/dashboards/infrastructure/blackbox.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 31128, 'inode': 1090258, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.4728014, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 
'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-27 01:00:09.587220 | orchestrator | changed: [testbed-node-0] => (item={'key': 'infrastructure/cadvisor.json', 'value': {'path': '/operations/grafana/dashboards/infrastructure/cadvisor.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 53882, 'inode': 1090263, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.4743392, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-27 01:00:09.587231 | orchestrator | changed: [testbed-node-1] => (item={'key': 'infrastructure/cadvisor.json', 'value': {'path': '/operations/grafana/dashboards/infrastructure/cadvisor.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 53882, 'inode': 1090263, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.4743392, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-27 01:00:09.587248 | orchestrator | changed: [testbed-node-2] => (item={'key': 'infrastructure/cadvisor.json', 'value': {'path': '/operations/grafana/dashboards/infrastructure/cadvisor.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 53882, 'inode': 1090263, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.4743392, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-27 01:00:09.587269 | orchestrator | changed: [testbed-node-0] => (item={'key': 'infrastructure/node_exporter_side_by_side.json', 'value': {'path': '/operations/grafana/dashboards/infrastructure/node_exporter_side_by_side.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 70691, 'inode': 1090344, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.4979408, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-27 01:00:09.587301 | orchestrator | changed: [testbed-node-1] => (item={'key': 'infrastructure/node_exporter_side_by_side.json', 'value': {'path': '/operations/grafana/dashboards/infrastructure/node_exporter_side_by_side.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 70691, 'inode': 1090344, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.4979408, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 
'isgid': False}}) 2025-09-27 01:00:09.587321 | orchestrator | changed: [testbed-node-2] => (item={'key': 'infrastructure/node_exporter_side_by_side.json', 'value': {'path': '/operations/grafana/dashboards/infrastructure/node_exporter_side_by_side.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 70691, 'inode': 1090344, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.4979408, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-27 01:00:09.587386 | orchestrator | changed: [testbed-node-0] => (item={'key': 'infrastructure/prometheus.json', 'value': {'path': '/operations/grafana/dashboards/infrastructure/prometheus.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 21898, 'inode': 1090363, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.503282, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-27 01:00:09.587400 | orchestrator | changed: [testbed-node-1] => (item={'key': 'infrastructure/prometheus.json', 'value': {'path': '/operations/grafana/dashboards/infrastructure/prometheus.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 21898, 'inode': 1090363, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.503282, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-27 01:00:09.587412 | orchestrator | changed: [testbed-node-2] => (item={'key': 'infrastructure/prometheus.json', 'value': {'path': '/operations/grafana/dashboards/infrastructure/prometheus.json', 'mode': '0644', 'isdir': False, 'ischr': False, 'isblk': False, 'isreg': True, 'isfifo': False, 'islnk': False, 'issock': False, 'uid': 0, 'gid': 0, 'size': 21898, 'inode': 1090363, 'dev': 112, 'nlink': 1, 'atime': 1758931333.0, 'mtime': 1758931333.0, 'ctime': 1758932198.503282, 'gr_name': 'root', 'pw_name': 'root', 'wusr': True, 'rusr': True, 'xusr': False, 'wgrp': False, 'rgrp': True, 'xgrp': False, 'woth': False, 'roth': True, 'xoth': False, 'isuid': False, 'isgid': False}}) 2025-09-27 01:00:09.587423 | orchestrator | 2025-09-27 01:00:09.587435 | orchestrator | TASK [grafana : Check grafana containers] ************************************** 2025-09-27 01:00:09.587446 | orchestrator | Saturday 27 September 2025 01:00:05 +0000 (0:00:37.387) 0:00:51.580 **** 2025-09-27 01:00:09.587493 | orchestrator | changed: [testbed-node-0] => (item={'key': 'grafana', 'value': {'container_name': 'grafana', 'group': 'grafana', 'enabled': True, 'image': 'registry.osism.tech/kolla/grafana:2024.2', 'volumes': ['/etc/kolla/grafana/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'grafana_server': {'enabled': 'yes', 'mode': 
'http', 'external': False, 'port': '3000', 'listen_port': '3000'}, 'grafana_server_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '3000', 'listen_port': '3000'}}}})
2025-09-27 01:00:09.587506 | orchestrator | changed: [testbed-node-1] => (item={'key': 'grafana', 'value': {'container_name': 'grafana', 'group': 'grafana', 'enabled': True, 'image': 'registry.osism.tech/kolla/grafana:2024.2', 'volumes': ['/etc/kolla/grafana/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'grafana_server': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '3000', 'listen_port': '3000'}, 'grafana_server_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '3000', 'listen_port': '3000'}}}})
2025-09-27 01:00:09.587522 | orchestrator | changed: [testbed-node-2] => (item={'key': 'grafana', 'value': {'container_name': 'grafana', 'group': 'grafana', 'enabled': True, 'image': 'registry.osism.tech/kolla/grafana:2024.2', 'volumes': ['/etc/kolla/grafana/:/var/lib/kolla/config_files/:ro', '/etc/localtime:/etc/localtime:ro', '/etc/timezone:/etc/timezone:ro', 'kolla_logs:/var/log/kolla/'], 'dimensions': {}, 'haproxy': {'grafana_server': {'enabled': 'yes', 'mode': 'http', 'external': False, 'port': '3000', 'listen_port': '3000'}, 'grafana_server_external': {'enabled': True, 'mode': 'http', 'external': True, 'external_fqdn': 'api.testbed.osism.xyz', 'port': '3000', 'listen_port': '3000'}}}})
2025-09-27 01:00:09.587534 | orchestrator |
2025-09-27 01:00:09.587551 | orchestrator | TASK [grafana : Creating grafana database] *************************************
2025-09-27 01:00:09.587562 | orchestrator | Saturday 27 September 2025 01:00:06 +0000 (0:00:00.863) 0:00:52.443 ****
2025-09-27 01:00:09.587574 | orchestrator | fatal: [testbed-node-0]: FAILED! => {"changed": false, "msg": "kolla_toolbox container is not running."}
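The fatal above is the database bootstrap step aborting early: the "Creating grafana database" task delegates its work to the kolla_toolbox helper container, and because that container is not running on testbed-node-0 there is nothing to execute the database commands in. A minimal sketch of that kind of precondition check, assuming the Docker SDK for Python and reusing the container name from the error message (illustrative only, not kolla-ansible's actual module code):

```python
# Sketch only: the precondition behind the failure above, rebuilt with the
# Docker SDK for Python (docker.from_env / containers.get are real calls);
# the function and the re-raised message are illustrative, not kolla-ansible's code.
import docker
from docker.errors import NotFound


def ensure_toolbox_running(name="kolla_toolbox"):
    """Return the helper container if it is running, otherwise raise."""
    client = docker.from_env()
    try:
        container = client.containers.get(name)
    except NotFound:
        raise RuntimeError(f"{name} container is not running.")
    if container.status != "running":
        raise RuntimeError(f"{name} container is not running.")
    return container


if __name__ == "__main__":
    toolbox = ensure_toolbox_running()
    # Only with the helper container up could a database-creation command be
    # executed inside it, e.g. via toolbox.exec_run(...).
    print(f"kolla_toolbox status: {toolbox.status}")
```

In this job the message therefore points at the missing helper container on testbed-node-0 rather than at Grafana itself; the grafana role only surfaces the symptom.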
=> {"changed": false, "msg": "kolla_toolbox container is not running."} 2025-09-27 01:00:09.587585 | orchestrator | 2025-09-27 01:00:09.587596 | orchestrator | PLAY RECAP ********************************************************************* 2025-09-27 01:00:09.587607 | orchestrator | testbed-node-0 : ok=15  changed=8  unreachable=0 failed=1  skipped=4  rescued=0 ignored=0 2025-09-27 01:00:09.587618 | orchestrator | testbed-node-1 : ok=13  changed=8  unreachable=0 failed=0 skipped=4  rescued=0 ignored=0 2025-09-27 01:00:09.587630 | orchestrator | testbed-node-2 : ok=13  changed=8  unreachable=0 failed=0 skipped=4  rescued=0 ignored=0 2025-09-27 01:00:09.587640 | orchestrator | 2025-09-27 01:00:09.587651 | orchestrator | 2025-09-27 01:00:09.587662 | orchestrator | TASKS RECAP ******************************************************************** 2025-09-27 01:00:09.587673 | orchestrator | Saturday 27 September 2025 01:00:06 +0000 (0:00:00.727) 0:00:53.171 **** 2025-09-27 01:00:09.587684 | orchestrator | =============================================================================== 2025-09-27 01:00:09.587695 | orchestrator | grafana : Copying over custom dashboards ------------------------------- 37.39s 2025-09-27 01:00:09.587705 | orchestrator | grafana : Configuring Prometheus as data source for Grafana ------------- 1.36s 2025-09-27 01:00:09.587716 | orchestrator | grafana : Configuring dashboards provisioning --------------------------- 1.36s 2025-09-27 01:00:09.587727 | orchestrator | service-cert-copy : grafana | Copying over extra CA certificates -------- 1.33s 2025-09-27 01:00:09.587744 | orchestrator | grafana : Copying over grafana.ini -------------------------------------- 1.33s 2025-09-27 01:00:09.587755 | orchestrator | grafana : Copying over config.json files -------------------------------- 1.32s 2025-09-27 01:00:09.587765 | orchestrator | service-cert-copy : grafana | Copying over backend internal TLS key ----- 0.96s 2025-09-27 01:00:09.587776 | orchestrator | grafana : Check grafana containers -------------------------------------- 0.86s 2025-09-27 01:00:09.587787 | orchestrator | grafana : Check if extra configuration file exists ---------------------- 0.84s 2025-09-27 01:00:09.587798 | orchestrator | grafana : Find templated grafana dashboards ----------------------------- 0.78s 2025-09-27 01:00:09.587809 | orchestrator | grafana : Ensuring config directories exist ----------------------------- 0.76s 2025-09-27 01:00:09.587819 | orchestrator | grafana : Creating grafana database ------------------------------------- 0.73s 2025-09-27 01:00:09.587830 | orchestrator | grafana : Find custom grafana dashboards -------------------------------- 0.72s 2025-09-27 01:00:09.587841 | orchestrator | grafana : include_tasks ------------------------------------------------- 0.63s 2025-09-27 01:00:09.587852 | orchestrator | grafana : Copying over extra configuration file ------------------------- 0.50s 2025-09-27 01:00:09.587862 | orchestrator | grafana : include_tasks ------------------------------------------------- 0.50s 2025-09-27 01:00:09.587873 | orchestrator | grafana : Prune templated Grafana dashboards ---------------------------- 0.49s 2025-09-27 01:00:09.587884 | orchestrator | Group hosts based on enabled services ----------------------------------- 0.41s 2025-09-27 01:00:09.587894 | orchestrator | service-cert-copy : grafana | Copying over backend internal TLS certificate --- 0.36s 2025-09-27 01:00:09.587905 | orchestrator | Group hosts based on Kolla action 
2025-09-27 01:00:09.587916 | orchestrator | 2025-09-27 01:00:09 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:00:09.587927 | orchestrator | 2025-09-27 01:00:09 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:00:12.624730 | orchestrator | 2025-09-27 01:00:12 | INFO  | Task e53b93e4-c64c-4396-8a50-60d422b94ec3 is in state STARTED 2025-09-27 01:00:12.626218 | orchestrator | 2025-09-27 01:00:12 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:00:12.628249 | orchestrator | 2025-09-27 01:00:12 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:00:12.628327 | orchestrator | 2025-09-27 01:00:12 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:00:15.666310 | orchestrator | 2025-09-27 01:00:15 | INFO  | Task e53b93e4-c64c-4396-8a50-60d422b94ec3 is in state STARTED 2025-09-27 01:00:15.667908 | orchestrator | 2025-09-27 01:00:15 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:00:15.669723 | orchestrator | 2025-09-27 01:00:15 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:00:15.669753 | orchestrator | 2025-09-27 01:00:15 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:00:18.710510 | orchestrator | 2025-09-27 01:00:18 | INFO  | Task e53b93e4-c64c-4396-8a50-60d422b94ec3 is in state STARTED 2025-09-27 01:00:18.712213 | orchestrator | 2025-09-27 01:00:18 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:00:18.713674 | orchestrator | 2025-09-27 01:00:18 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:00:18.713698 | orchestrator | 2025-09-27 01:00:18 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:00:21.754504 | orchestrator | 2025-09-27 01:00:21 | INFO  | Task e53b93e4-c64c-4396-8a50-60d422b94ec3 is in state STARTED 2025-09-27 01:00:21.756612 | orchestrator | 2025-09-27 01:00:21 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:00:21.758436 | orchestrator | 2025-09-27 01:00:21 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:00:21.758581 | orchestrator | 2025-09-27 01:00:21 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:00:24.794895 | orchestrator | 2025-09-27 01:00:24 | INFO  | Task e53b93e4-c64c-4396-8a50-60d422b94ec3 is in state STARTED 2025-09-27 01:00:24.797972 | orchestrator | 2025-09-27 01:00:24 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:00:24.800168 | orchestrator | 2025-09-27 01:00:24 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:00:24.800197 | orchestrator | 2025-09-27 01:00:24 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:00:27.851370 | orchestrator | 2025-09-27 01:00:27 | INFO  | Task e53b93e4-c64c-4396-8a50-60d422b94ec3 is in state STARTED 2025-09-27 01:00:27.852115 | orchestrator | 2025-09-27 01:00:27 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:00:27.854612 | orchestrator | 2025-09-27 01:00:27 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:00:27.854689 | orchestrator | 2025-09-27 01:00:27 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:00:30.904147 | orchestrator | 2025-09-27 01:00:30 | INFO  | Task e53b93e4-c64c-4396-8a50-60d422b94ec3 is in state STARTED
2025-09-27 01:00:30.905432 | orchestrator | 2025-09-27 01:00:30 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:00:30.907787 | orchestrator | 2025-09-27 01:00:30 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:00:30.907820 | orchestrator | 2025-09-27 01:00:30 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:00:33.957884 | orchestrator | 2025-09-27 01:00:33 | INFO  | Task e53b93e4-c64c-4396-8a50-60d422b94ec3 is in state STARTED 2025-09-27 01:00:33.960810 | orchestrator | 2025-09-27 01:00:33 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:00:33.963866 | orchestrator | 2025-09-27 01:00:33 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:00:33.963991 | orchestrator | 2025-09-27 01:00:33 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:00:37.008731 | orchestrator | 2025-09-27 01:00:37 | INFO  | Task e53b93e4-c64c-4396-8a50-60d422b94ec3 is in state STARTED 2025-09-27 01:00:37.010968 | orchestrator | 2025-09-27 01:00:37 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:00:37.013116 | orchestrator | 2025-09-27 01:00:37 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:00:37.013153 | orchestrator | 2025-09-27 01:00:37 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:00:40.055821 | orchestrator | 2025-09-27 01:00:40 | INFO  | Task e53b93e4-c64c-4396-8a50-60d422b94ec3 is in state STARTED 2025-09-27 01:00:40.057552 | orchestrator | 2025-09-27 01:00:40 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:00:40.059314 | orchestrator | 2025-09-27 01:00:40 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:00:40.059426 | orchestrator | 2025-09-27 01:00:40 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:00:43.099765 | orchestrator | 2025-09-27 01:00:43 | INFO  | Task e53b93e4-c64c-4396-8a50-60d422b94ec3 is in state STARTED 2025-09-27 01:00:43.101828 | orchestrator | 2025-09-27 01:00:43 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:00:43.104412 | orchestrator | 2025-09-27 01:00:43 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:00:43.104478 | orchestrator | 2025-09-27 01:00:43 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:00:46.141387 | orchestrator | 2025-09-27 01:00:46 | INFO  | Task e53b93e4-c64c-4396-8a50-60d422b94ec3 is in state STARTED 2025-09-27 01:00:46.143565 | orchestrator | 2025-09-27 01:00:46 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:00:46.145098 | orchestrator | 2025-09-27 01:00:46 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:00:46.145135 | orchestrator | 2025-09-27 01:00:46 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:00:49.190600 | orchestrator | 2025-09-27 01:00:49 | INFO  | Task e53b93e4-c64c-4396-8a50-60d422b94ec3 is in state STARTED 2025-09-27 01:00:49.191773 | orchestrator | 2025-09-27 01:00:49 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:00:49.193568 | orchestrator | 2025-09-27 01:00:49 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:00:49.193593 | orchestrator | 2025-09-27 01:00:49 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:00:52.234587 | orchestrator | 
2025-09-27 01:00:52 | INFO  | Task e53b93e4-c64c-4396-8a50-60d422b94ec3 is in state STARTED 2025-09-27 01:00:52.236188 | orchestrator | 2025-09-27 01:00:52 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:00:52.237963 | orchestrator | 2025-09-27 01:00:52 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:00:52.237984 | orchestrator | 2025-09-27 01:00:52 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:00:55.281627 | orchestrator | 2025-09-27 01:00:55 | INFO  | Task e53b93e4-c64c-4396-8a50-60d422b94ec3 is in state STARTED 2025-09-27 01:00:55.282695 | orchestrator | 2025-09-27 01:00:55 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:00:55.285008 | orchestrator | 2025-09-27 01:00:55 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:00:55.285082 | orchestrator | 2025-09-27 01:00:55 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:00:58.319807 | orchestrator | 2025-09-27 01:00:58 | INFO  | Task e53b93e4-c64c-4396-8a50-60d422b94ec3 is in state STARTED 2025-09-27 01:00:58.320923 | orchestrator | 2025-09-27 01:00:58 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:00:58.322606 | orchestrator | 2025-09-27 01:00:58 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:00:58.322636 | orchestrator | 2025-09-27 01:00:58 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:01:01.369799 | orchestrator | 2025-09-27 01:01:01 | INFO  | Task e53b93e4-c64c-4396-8a50-60d422b94ec3 is in state STARTED 2025-09-27 01:01:01.371146 | orchestrator | 2025-09-27 01:01:01 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:01:01.374581 | orchestrator | 2025-09-27 01:01:01 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:01:01.374608 | orchestrator | 2025-09-27 01:01:01 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:01:04.413083 | orchestrator | 2025-09-27 01:01:04 | INFO  | Task e53b93e4-c64c-4396-8a50-60d422b94ec3 is in state STARTED 2025-09-27 01:01:04.414390 | orchestrator | 2025-09-27 01:01:04 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:01:04.416089 | orchestrator | 2025-09-27 01:01:04 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:01:04.416110 | orchestrator | 2025-09-27 01:01:04 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:01:07.458884 | orchestrator | 2025-09-27 01:01:07 | INFO  | Task e53b93e4-c64c-4396-8a50-60d422b94ec3 is in state STARTED 2025-09-27 01:01:07.460743 | orchestrator | 2025-09-27 01:01:07 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:01:07.462388 | orchestrator | 2025-09-27 01:01:07 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:01:07.462411 | orchestrator | 2025-09-27 01:01:07 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:01:10.502910 | orchestrator | 2025-09-27 01:01:10 | INFO  | Task e53b93e4-c64c-4396-8a50-60d422b94ec3 is in state STARTED 2025-09-27 01:01:10.504970 | orchestrator | 2025-09-27 01:01:10 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:01:10.506526 | orchestrator | 2025-09-27 01:01:10 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:01:10.506551 | orchestrator | 2025-09-27 01:01:10 | INFO 
| Wait 1 second(s) until the next check
2025-09-27 01:01:13.550368 | orchestrator | 2025-09-27 01:01:13 | INFO  | Task e53b93e4-c64c-4396-8a50-60d422b94ec3 is in state STARTED
2025-09-27 01:01:13.551997 | orchestrator | 2025-09-27 01:01:13 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED
2025-09-27 01:01:13.553870 | orchestrator | 2025-09-27 01:01:13 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED
2025-09-27 01:01:13.554159 | orchestrator | 2025-09-27 01:01:13 | INFO  | Wait 1 second(s) until the next check
[... the same status check repeats roughly every 3 seconds; all three tasks remain in state STARTED from 2025-09-27 01:01:16 through 2025-09-27 01:03:18 ...]
2025-09-27 01:03:21.572752 | orchestrator | 2025-09-27 01:03:21 | INFO  | Task e53b93e4-c64c-4396-8a50-60d422b94ec3 is in state SUCCESS
2025-09-27 01:03:21.573764 | orchestrator | 2025-09-27 01:03:21 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED
2025-09-27 01:03:21.575535 | orchestrator | 2025-09-27 01:03:21 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED
2025-09-27 01:03:21.575563 | orchestrator | 2025-09-27 01:03:21 | INFO  | Wait 1 second(s) until the next check
2025-09-27 01:03:24.619475 | orchestrator | 2025-09-27 01:03:24 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED
2025-09-27 01:03:24.620721 | orchestrator | 2025-09-27 01:03:24 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED
2025-09-27 01:03:24.620767 | orchestrator | 2025-09-27 01:03:24 | INFO  | Wait 1 second(s) until the next check
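What the console shows here is a simple wait loop: the deploy tooling queries the state of each queued task, prints it, waits one second and checks again until every task has left the STARTED state (the checks land roughly three seconds apart, apparently the one-second wait plus the time the state queries themselves take; a task that reaches SUCCESS is reported once and then dropped from subsequent checks). A minimal sketch of such a loop is given below; the Python function wait_for_tasks and the get_state callback are assumptions made for illustration, not the actual OSISM tooling.

    import time
    from typing import Callable, Iterable

    def wait_for_tasks(task_ids: Iterable[str],
                       get_state: Callable[[str], str],
                       interval: float = 1.0) -> None:
        # Poll every task until none of them reports STARTED anymore,
        # printing the same kind of status lines seen in the log above.
        pending = list(task_ids)
        while pending:
            still_running = []
            for task_id in pending:
                state = get_state(task_id)  # e.g. "STARTED", "SUCCESS", "FAILURE"
                print(f"Task {task_id} is in state {state}")
                if state == "STARTED":
                    still_running.append(task_id)
            pending = still_running
            if pending:
                print(f"Wait {interval:.0f} second(s) until the next check")
                time.sleep(interval)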
[... the status check keeps repeating roughly every 3 seconds; the remaining tasks c8c195a8-0572-4728-82e9-0d11795e0ba9 and 6080a85d-265e-44df-8fd4-200b92feb3b5 stay in state STARTED from 2025-09-27 01:03:27 through 2025-09-27 01:09:45 ...]
2025-09-27 01:09:48.594392 | orchestrator | 2025-09-27 01:09:48 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED
2025-09-27 01:09:48.594481 | orchestrator | 2025-09-27 01:09:48 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state
STARTED 2025-09-27 01:09:48.594491 | orchestrator | 2025-09-27 01:09:48 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:09:51.635059 | orchestrator | 2025-09-27 01:09:51 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:09:51.635289 | orchestrator | 2025-09-27 01:09:51 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:09:51.635316 | orchestrator | 2025-09-27 01:09:51 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:09:54.691737 | orchestrator | 2025-09-27 01:09:54 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:09:54.695957 | orchestrator | 2025-09-27 01:09:54 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:09:54.696658 | orchestrator | 2025-09-27 01:09:54 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:09:57.756276 | orchestrator | 2025-09-27 01:09:57 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:09:57.757665 | orchestrator | 2025-09-27 01:09:57 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:09:57.757708 | orchestrator | 2025-09-27 01:09:57 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:10:00.809417 | orchestrator | 2025-09-27 01:10:00 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:10:00.810670 | orchestrator | 2025-09-27 01:10:00 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:10:00.810935 | orchestrator | 2025-09-27 01:10:00 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:10:03.858885 | orchestrator | 2025-09-27 01:10:03 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:10:03.860268 | orchestrator | 2025-09-27 01:10:03 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:10:03.860312 | orchestrator | 2025-09-27 01:10:03 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:10:06.918289 | orchestrator | 2025-09-27 01:10:06 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:10:06.920385 | orchestrator | 2025-09-27 01:10:06 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:10:06.920497 | orchestrator | 2025-09-27 01:10:06 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:10:09.965510 | orchestrator | 2025-09-27 01:10:09 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:10:09.967198 | orchestrator | 2025-09-27 01:10:09 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:10:09.967286 | orchestrator | 2025-09-27 01:10:09 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:10:13.014604 | orchestrator | 2025-09-27 01:10:13 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:10:13.016274 | orchestrator | 2025-09-27 01:10:13 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:10:13.016289 | orchestrator | 2025-09-27 01:10:13 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:10:16.064772 | orchestrator | 2025-09-27 01:10:16 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:10:16.066220 | orchestrator | 2025-09-27 01:10:16 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:10:16.066296 | orchestrator | 2025-09-27 01:10:16 | INFO  | Wait 1 second(s) 
until the next check 2025-09-27 01:10:19.112667 | orchestrator | 2025-09-27 01:10:19 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:10:19.114421 | orchestrator | 2025-09-27 01:10:19 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:10:19.114515 | orchestrator | 2025-09-27 01:10:19 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:10:22.150738 | orchestrator | 2025-09-27 01:10:22 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:10:22.152214 | orchestrator | 2025-09-27 01:10:22 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:10:22.152243 | orchestrator | 2025-09-27 01:10:22 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:10:25.199253 | orchestrator | 2025-09-27 01:10:25 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:10:25.200043 | orchestrator | 2025-09-27 01:10:25 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:10:25.200072 | orchestrator | 2025-09-27 01:10:25 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:10:28.243690 | orchestrator | 2025-09-27 01:10:28 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:10:28.244483 | orchestrator | 2025-09-27 01:10:28 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:10:28.244518 | orchestrator | 2025-09-27 01:10:28 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:10:31.290190 | orchestrator | 2025-09-27 01:10:31 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:10:31.292359 | orchestrator | 2025-09-27 01:10:31 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:10:31.292390 | orchestrator | 2025-09-27 01:10:31 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:10:34.338845 | orchestrator | 2025-09-27 01:10:34 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:10:34.340393 | orchestrator | 2025-09-27 01:10:34 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:10:34.340711 | orchestrator | 2025-09-27 01:10:34 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:10:37.382708 | orchestrator | 2025-09-27 01:10:37 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:10:37.384497 | orchestrator | 2025-09-27 01:10:37 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:10:37.384563 | orchestrator | 2025-09-27 01:10:37 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:10:40.432579 | orchestrator | 2025-09-27 01:10:40 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:10:40.434146 | orchestrator | 2025-09-27 01:10:40 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:10:40.434174 | orchestrator | 2025-09-27 01:10:40 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:10:43.486264 | orchestrator | 2025-09-27 01:10:43 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:10:43.488170 | orchestrator | 2025-09-27 01:10:43 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:10:43.488200 | orchestrator | 2025-09-27 01:10:43 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:10:46.535919 | orchestrator | 2025-09-27 01:10:46 | INFO  | 
Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:10:46.537687 | orchestrator | 2025-09-27 01:10:46 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:10:46.537722 | orchestrator | 2025-09-27 01:10:46 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:10:49.584954 | orchestrator | 2025-09-27 01:10:49 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:10:49.586635 | orchestrator | 2025-09-27 01:10:49 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:10:49.586734 | orchestrator | 2025-09-27 01:10:49 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:10:52.634552 | orchestrator | 2025-09-27 01:10:52 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:10:52.636121 | orchestrator | 2025-09-27 01:10:52 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:10:52.636158 | orchestrator | 2025-09-27 01:10:52 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:10:55.679536 | orchestrator | 2025-09-27 01:10:55 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:10:55.680735 | orchestrator | 2025-09-27 01:10:55 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:10:55.680797 | orchestrator | 2025-09-27 01:10:55 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:10:58.730091 | orchestrator | 2025-09-27 01:10:58 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:10:58.730875 | orchestrator | 2025-09-27 01:10:58 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:10:58.731225 | orchestrator | 2025-09-27 01:10:58 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:11:01.766761 | orchestrator | 2025-09-27 01:11:01 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:11:01.768345 | orchestrator | 2025-09-27 01:11:01 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:11:01.768377 | orchestrator | 2025-09-27 01:11:01 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:11:04.813753 | orchestrator | 2025-09-27 01:11:04 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:11:04.815105 | orchestrator | 2025-09-27 01:11:04 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:11:04.815194 | orchestrator | 2025-09-27 01:11:04 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:11:07.858419 | orchestrator | 2025-09-27 01:11:07 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:11:07.859482 | orchestrator | 2025-09-27 01:11:07 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:11:07.859514 | orchestrator | 2025-09-27 01:11:07 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:11:10.902156 | orchestrator | 2025-09-27 01:11:10 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:11:10.902517 | orchestrator | 2025-09-27 01:11:10 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:11:10.902548 | orchestrator | 2025-09-27 01:11:10 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:11:13.945709 | orchestrator | 2025-09-27 01:11:13 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:11:13.946163 | 
orchestrator | 2025-09-27 01:11:13 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:11:13.946196 | orchestrator | 2025-09-27 01:11:13 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:11:16.994488 | orchestrator | 2025-09-27 01:11:16 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:11:16.995678 | orchestrator | 2025-09-27 01:11:16 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:11:16.995876 | orchestrator | 2025-09-27 01:11:16 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:11:20.037434 | orchestrator | 2025-09-27 01:11:20 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:11:20.038532 | orchestrator | 2025-09-27 01:11:20 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:11:20.038608 | orchestrator | 2025-09-27 01:11:20 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:11:23.082905 | orchestrator | 2025-09-27 01:11:23 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:11:23.084202 | orchestrator | 2025-09-27 01:11:23 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:11:23.084232 | orchestrator | 2025-09-27 01:11:23 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:11:26.126249 | orchestrator | 2025-09-27 01:11:26 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:11:26.127858 | orchestrator | 2025-09-27 01:11:26 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:11:26.127890 | orchestrator | 2025-09-27 01:11:26 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:11:29.171266 | orchestrator | 2025-09-27 01:11:29 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:11:29.174106 | orchestrator | 2025-09-27 01:11:29 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:11:29.174141 | orchestrator | 2025-09-27 01:11:29 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:11:32.216354 | orchestrator | 2025-09-27 01:11:32 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:11:32.218499 | orchestrator | 2025-09-27 01:11:32 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:11:32.218535 | orchestrator | 2025-09-27 01:11:32 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:11:35.258891 | orchestrator | 2025-09-27 01:11:35 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:11:35.260636 | orchestrator | 2025-09-27 01:11:35 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:11:35.261030 | orchestrator | 2025-09-27 01:11:35 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:11:38.301190 | orchestrator | 2025-09-27 01:11:38 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:11:38.302519 | orchestrator | 2025-09-27 01:11:38 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:11:38.302553 | orchestrator | 2025-09-27 01:11:38 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:11:41.342177 | orchestrator | 2025-09-27 01:11:41 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:11:41.343725 | orchestrator | 2025-09-27 01:11:41 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state 
STARTED 2025-09-27 01:11:41.343767 | orchestrator | 2025-09-27 01:11:41 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:11:44.392881 | orchestrator | 2025-09-27 01:11:44 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:11:44.394217 | orchestrator | 2025-09-27 01:11:44 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:11:44.394282 | orchestrator | 2025-09-27 01:11:44 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:11:47.440601 | orchestrator | 2025-09-27 01:11:47 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:11:47.442793 | orchestrator | 2025-09-27 01:11:47 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:11:47.442924 | orchestrator | 2025-09-27 01:11:47 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:11:50.486919 | orchestrator | 2025-09-27 01:11:50 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:11:50.488301 | orchestrator | 2025-09-27 01:11:50 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:11:50.488332 | orchestrator | 2025-09-27 01:11:50 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:11:53.532665 | orchestrator | 2025-09-27 01:11:53 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:11:53.533960 | orchestrator | 2025-09-27 01:11:53 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:11:53.534118 | orchestrator | 2025-09-27 01:11:53 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:11:56.577447 | orchestrator | 2025-09-27 01:11:56 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:11:56.579288 | orchestrator | 2025-09-27 01:11:56 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:11:56.579320 | orchestrator | 2025-09-27 01:11:56 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:11:59.622770 | orchestrator | 2025-09-27 01:11:59 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:11:59.625496 | orchestrator | 2025-09-27 01:11:59 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:11:59.625585 | orchestrator | 2025-09-27 01:11:59 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:12:02.673833 | orchestrator | 2025-09-27 01:12:02 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:12:02.676466 | orchestrator | 2025-09-27 01:12:02 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:12:02.676586 | orchestrator | 2025-09-27 01:12:02 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:12:05.728041 | orchestrator | 2025-09-27 01:12:05 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:12:05.731362 | orchestrator | 2025-09-27 01:12:05 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:12:05.731504 | orchestrator | 2025-09-27 01:12:05 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:12:08.782463 | orchestrator | 2025-09-27 01:12:08 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:12:08.785603 | orchestrator | 2025-09-27 01:12:08 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:12:08.785636 | orchestrator | 2025-09-27 01:12:08 | INFO  | Wait 1 second(s) 
until the next check 2025-09-27 01:12:11.834419 | orchestrator | 2025-09-27 01:12:11 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:12:11.835818 | orchestrator | 2025-09-27 01:12:11 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:12:11.835851 | orchestrator | 2025-09-27 01:12:11 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:12:14.883545 | orchestrator | 2025-09-27 01:12:14 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:12:14.885166 | orchestrator | 2025-09-27 01:12:14 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:12:14.885198 | orchestrator | 2025-09-27 01:12:14 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:12:17.933621 | orchestrator | 2025-09-27 01:12:17 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:12:17.935244 | orchestrator | 2025-09-27 01:12:17 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:12:17.935325 | orchestrator | 2025-09-27 01:12:17 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:12:20.979624 | orchestrator | 2025-09-27 01:12:20 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:12:20.980221 | orchestrator | 2025-09-27 01:12:20 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:12:20.980252 | orchestrator | 2025-09-27 01:12:20 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:12:24.024158 | orchestrator | 2025-09-27 01:12:24 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:12:24.024463 | orchestrator | 2025-09-27 01:12:24 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:12:24.024597 | orchestrator | 2025-09-27 01:12:24 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:12:27.063739 | orchestrator | 2025-09-27 01:12:27 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:12:27.064698 | orchestrator | 2025-09-27 01:12:27 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:12:27.064729 | orchestrator | 2025-09-27 01:12:27 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:12:30.105082 | orchestrator | 2025-09-27 01:12:30 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:12:30.106479 | orchestrator | 2025-09-27 01:12:30 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:12:30.106512 | orchestrator | 2025-09-27 01:12:30 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:12:33.157653 | orchestrator | 2025-09-27 01:12:33 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:12:33.158593 | orchestrator | 2025-09-27 01:12:33 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:12:33.158901 | orchestrator | 2025-09-27 01:12:33 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:12:36.209445 | orchestrator | 2025-09-27 01:12:36 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:12:36.211402 | orchestrator | 2025-09-27 01:12:36 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:12:36.211736 | orchestrator | 2025-09-27 01:12:36 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:12:39.260502 | orchestrator | 2025-09-27 01:12:39 | INFO  | 
Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:12:39.260616 | orchestrator | 2025-09-27 01:12:39 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:12:39.260632 | orchestrator | 2025-09-27 01:12:39 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:12:42.319952 | orchestrator | 2025-09-27 01:12:42 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:12:42.321588 | orchestrator | 2025-09-27 01:12:42 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:12:42.321683 | orchestrator | 2025-09-27 01:12:42 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:12:45.379551 | orchestrator | 2025-09-27 01:12:45 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:12:45.382080 | orchestrator | 2025-09-27 01:12:45 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:12:45.382125 | orchestrator | 2025-09-27 01:12:45 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:12:48.449549 | orchestrator | 2025-09-27 01:12:48 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:12:48.451225 | orchestrator | 2025-09-27 01:12:48 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:12:48.451417 | orchestrator | 2025-09-27 01:12:48 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:12:51.505555 | orchestrator | 2025-09-27 01:12:51 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:12:51.507507 | orchestrator | 2025-09-27 01:12:51 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:12:51.507672 | orchestrator | 2025-09-27 01:12:51 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:12:54.561085 | orchestrator | 2025-09-27 01:12:54 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:12:54.564403 | orchestrator | 2025-09-27 01:12:54 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:12:54.565796 | orchestrator | 2025-09-27 01:12:54 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:12:57.606753 | orchestrator | 2025-09-27 01:12:57 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:12:57.609150 | orchestrator | 2025-09-27 01:12:57 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:12:57.609195 | orchestrator | 2025-09-27 01:12:57 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:13:00.656611 | orchestrator | 2025-09-27 01:13:00 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:13:00.658671 | orchestrator | 2025-09-27 01:13:00 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:13:00.658705 | orchestrator | 2025-09-27 01:13:00 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:13:03.756130 | orchestrator | 2025-09-27 01:13:03 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:13:03.757834 | orchestrator | 2025-09-27 01:13:03 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:13:03.757870 | orchestrator | 2025-09-27 01:13:03 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:13:06.805609 | orchestrator | 2025-09-27 01:13:06 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:13:06.807047 | 
orchestrator | 2025-09-27 01:13:06 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:13:06.807084 | orchestrator | 2025-09-27 01:13:06 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:13:09.871021 | orchestrator | 2025-09-27 01:13:09 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:13:09.871764 | orchestrator | 2025-09-27 01:13:09 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:13:09.871856 | orchestrator | 2025-09-27 01:13:09 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:13:12.922420 | orchestrator | 2025-09-27 01:13:12 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:13:12.924333 | orchestrator | 2025-09-27 01:13:12 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:13:12.924366 | orchestrator | 2025-09-27 01:13:12 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:13:15.983904 | orchestrator | 2025-09-27 01:13:15 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:13:15.985904 | orchestrator | 2025-09-27 01:13:15 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:13:15.985913 | orchestrator | 2025-09-27 01:13:15 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:13:19.042456 | orchestrator | 2025-09-27 01:13:19 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:13:19.044173 | orchestrator | 2025-09-27 01:13:19 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:13:19.044292 | orchestrator | 2025-09-27 01:13:19 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:13:22.098804 | orchestrator | 2025-09-27 01:13:22 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:13:22.101140 | orchestrator | 2025-09-27 01:13:22 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:13:22.101218 | orchestrator | 2025-09-27 01:13:22 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:13:25.149345 | orchestrator | 2025-09-27 01:13:25 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:13:25.150543 | orchestrator | 2025-09-27 01:13:25 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:13:25.150761 | orchestrator | 2025-09-27 01:13:25 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:13:28.200471 | orchestrator | 2025-09-27 01:13:28 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:13:28.203657 | orchestrator | 2025-09-27 01:13:28 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:13:28.203761 | orchestrator | 2025-09-27 01:13:28 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:13:31.249395 | orchestrator | 2025-09-27 01:13:31 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:13:31.250595 | orchestrator | 2025-09-27 01:13:31 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:13:31.250629 | orchestrator | 2025-09-27 01:13:31 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:13:34.299305 | orchestrator | 2025-09-27 01:13:34 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:13:34.300939 | orchestrator | 2025-09-27 01:13:34 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state 
STARTED 2025-09-27 01:13:34.301039 | orchestrator | 2025-09-27 01:13:34 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:13:37.344586 | orchestrator | 2025-09-27 01:13:37 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:13:37.346097 | orchestrator | 2025-09-27 01:13:37 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:13:37.346132 | orchestrator | 2025-09-27 01:13:37 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:13:40.389919 | orchestrator | 2025-09-27 01:13:40 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:13:40.390700 | orchestrator | 2025-09-27 01:13:40 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:13:40.390734 | orchestrator | 2025-09-27 01:13:40 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:13:43.435855 | orchestrator | 2025-09-27 01:13:43 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:13:43.437368 | orchestrator | 2025-09-27 01:13:43 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:13:43.437402 | orchestrator | 2025-09-27 01:13:43 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:13:46.485227 | orchestrator | 2025-09-27 01:13:46 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:13:46.486275 | orchestrator | 2025-09-27 01:13:46 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:13:46.486313 | orchestrator | 2025-09-27 01:13:46 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:13:49.529249 | orchestrator | 2025-09-27 01:13:49 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:13:49.530608 | orchestrator | 2025-09-27 01:13:49 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:13:49.530645 | orchestrator | 2025-09-27 01:13:49 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:13:52.571440 | orchestrator | 2025-09-27 01:13:52 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:13:52.572410 | orchestrator | 2025-09-27 01:13:52 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:13:52.572629 | orchestrator | 2025-09-27 01:13:52 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:13:55.618955 | orchestrator | 2025-09-27 01:13:55 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:13:55.620467 | orchestrator | 2025-09-27 01:13:55 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:13:55.620509 | orchestrator | 2025-09-27 01:13:55 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:13:58.674561 | orchestrator | 2025-09-27 01:13:58 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:13:58.676444 | orchestrator | 2025-09-27 01:13:58 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:13:58.676476 | orchestrator | 2025-09-27 01:13:58 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:14:01.727419 | orchestrator | 2025-09-27 01:14:01 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:14:01.728346 | orchestrator | 2025-09-27 01:14:01 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:14:01.728400 | orchestrator | 2025-09-27 01:14:01 | INFO  | Wait 1 second(s) 
until the next check 2025-09-27 01:14:04.776770 | orchestrator | 2025-09-27 01:14:04 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:14:04.780574 | orchestrator | 2025-09-27 01:14:04 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:14:04.780854 | orchestrator | 2025-09-27 01:14:04 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:14:07.825344 | orchestrator | 2025-09-27 01:14:07 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:14:07.826719 | orchestrator | 2025-09-27 01:14:07 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:14:07.826993 | orchestrator | 2025-09-27 01:14:07 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:14:10.871845 | orchestrator | 2025-09-27 01:14:10 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:14:10.873281 | orchestrator | 2025-09-27 01:14:10 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:14:10.873366 | orchestrator | 2025-09-27 01:14:10 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:14:13.918755 | orchestrator | 2025-09-27 01:14:13 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:14:13.920454 | orchestrator | 2025-09-27 01:14:13 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:14:13.920490 | orchestrator | 2025-09-27 01:14:13 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:14:16.966762 | orchestrator | 2025-09-27 01:14:16 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:14:16.968307 | orchestrator | 2025-09-27 01:14:16 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:14:16.968398 | orchestrator | 2025-09-27 01:14:16 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:14:20.018632 | orchestrator | 2025-09-27 01:14:20 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:14:20.020306 | orchestrator | 2025-09-27 01:14:20 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:14:20.020420 | orchestrator | 2025-09-27 01:14:20 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:14:23.069375 | orchestrator | 2025-09-27 01:14:23 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:14:23.071009 | orchestrator | 2025-09-27 01:14:23 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:14:23.071299 | orchestrator | 2025-09-27 01:14:23 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:14:26.121044 | orchestrator | 2025-09-27 01:14:26 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:14:26.121855 | orchestrator | 2025-09-27 01:14:26 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:14:26.121891 | orchestrator | 2025-09-27 01:14:26 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:14:29.170806 | orchestrator | 2025-09-27 01:14:29 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:14:29.171733 | orchestrator | 2025-09-27 01:14:29 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:14:29.171939 | orchestrator | 2025-09-27 01:14:29 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:14:32.224661 | orchestrator | 2025-09-27 01:14:32 | INFO  | 
Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:14:32.226505 | orchestrator | 2025-09-27 01:14:32 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:14:32.226537 | orchestrator | 2025-09-27 01:14:32 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:14:35.271455 | orchestrator | 2025-09-27 01:14:35 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:14:35.273406 | orchestrator | 2025-09-27 01:14:35 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:14:35.273665 | orchestrator | 2025-09-27 01:14:35 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:14:38.323348 | orchestrator | 2025-09-27 01:14:38 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:14:38.325168 | orchestrator | 2025-09-27 01:14:38 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:14:38.325406 | orchestrator | 2025-09-27 01:14:38 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:14:41.374781 | orchestrator | 2025-09-27 01:14:41 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:14:41.376020 | orchestrator | 2025-09-27 01:14:41 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:14:41.376097 | orchestrator | 2025-09-27 01:14:41 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:14:44.425175 | orchestrator | 2025-09-27 01:14:44 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:14:44.426544 | orchestrator | 2025-09-27 01:14:44 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:14:44.426577 | orchestrator | 2025-09-27 01:14:44 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:14:47.472741 | orchestrator | 2025-09-27 01:14:47 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:14:47.474411 | orchestrator | 2025-09-27 01:14:47 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:14:47.474450 | orchestrator | 2025-09-27 01:14:47 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:14:50.519465 | orchestrator | 2025-09-27 01:14:50 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:14:50.520191 | orchestrator | 2025-09-27 01:14:50 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:14:50.520224 | orchestrator | 2025-09-27 01:14:50 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:14:53.565796 | orchestrator | 2025-09-27 01:14:53 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:14:53.567878 | orchestrator | 2025-09-27 01:14:53 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:14:53.568189 | orchestrator | 2025-09-27 01:14:53 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:14:56.609120 | orchestrator | 2025-09-27 01:14:56 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:14:56.611259 | orchestrator | 2025-09-27 01:14:56 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:14:56.611297 | orchestrator | 2025-09-27 01:14:56 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:14:59.650171 | orchestrator | 2025-09-27 01:14:59 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:14:59.652227 | 
orchestrator | 2025-09-27 01:14:59 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:14:59.652299 | orchestrator | 2025-09-27 01:14:59 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:15:02.699486 | orchestrator | 2025-09-27 01:15:02 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:15:02.701179 | orchestrator | 2025-09-27 01:15:02 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:15:02.701301 | orchestrator | 2025-09-27 01:15:02 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:15:05.748718 | orchestrator | 2025-09-27 01:15:05 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:15:05.750191 | orchestrator | 2025-09-27 01:15:05 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:15:05.750219 | orchestrator | 2025-09-27 01:15:05 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:15:08.798439 | orchestrator | 2025-09-27 01:15:08 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:15:08.799434 | orchestrator | 2025-09-27 01:15:08 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:15:08.799466 | orchestrator | 2025-09-27 01:15:08 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:15:11.843317 | orchestrator | 2025-09-27 01:15:11 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:15:11.845178 | orchestrator | 2025-09-27 01:15:11 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:15:11.845208 | orchestrator | 2025-09-27 01:15:11 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:15:14.891723 | orchestrator | 2025-09-27 01:15:14 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:15:14.894203 | orchestrator | 2025-09-27 01:15:14 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:15:14.894244 | orchestrator | 2025-09-27 01:15:14 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:15:17.936012 | orchestrator | 2025-09-27 01:15:17 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:15:17.937760 | orchestrator | 2025-09-27 01:15:17 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:15:17.937803 | orchestrator | 2025-09-27 01:15:17 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:15:20.987256 | orchestrator | 2025-09-27 01:15:20 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:15:20.988516 | orchestrator | 2025-09-27 01:15:20 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:15:20.989080 | orchestrator | 2025-09-27 01:15:20 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:15:24.042490 | orchestrator | 2025-09-27 01:15:24 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:15:24.043792 | orchestrator | 2025-09-27 01:15:24 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:15:24.043829 | orchestrator | 2025-09-27 01:15:24 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:15:27.093104 | orchestrator | 2025-09-27 01:15:27 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:15:27.094601 | orchestrator | 2025-09-27 01:15:27 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state 
STARTED 2025-09-27 01:15:27.094643 | orchestrator | 2025-09-27 01:15:27 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:15:30.133393 | orchestrator | 2025-09-27 01:15:30 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:15:30.135025 | orchestrator | 2025-09-27 01:15:30 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:15:30.135294 | orchestrator | 2025-09-27 01:15:30 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:15:33.187435 | orchestrator | 2025-09-27 01:15:33 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:15:33.189229 | orchestrator | 2025-09-27 01:15:33 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:15:33.189257 | orchestrator | 2025-09-27 01:15:33 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:15:36.233194 | orchestrator | 2025-09-27 01:15:36 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:15:36.235345 | orchestrator | 2025-09-27 01:15:36 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:15:36.235706 | orchestrator | 2025-09-27 01:15:36 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:15:39.279365 | orchestrator | 2025-09-27 01:15:39 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:15:39.280924 | orchestrator | 2025-09-27 01:15:39 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:15:39.281136 | orchestrator | 2025-09-27 01:15:39 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:15:42.326937 | orchestrator | 2025-09-27 01:15:42 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:15:42.328326 | orchestrator | 2025-09-27 01:15:42 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:15:42.328356 | orchestrator | 2025-09-27 01:15:42 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:15:45.371851 | orchestrator | 2025-09-27 01:15:45 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:15:45.373841 | orchestrator | 2025-09-27 01:15:45 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:15:45.373957 | orchestrator | 2025-09-27 01:15:45 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:15:48.419240 | orchestrator | 2025-09-27 01:15:48 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:15:48.420841 | orchestrator | 2025-09-27 01:15:48 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:15:48.420874 | orchestrator | 2025-09-27 01:15:48 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:15:51.461755 | orchestrator | 2025-09-27 01:15:51 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:15:51.463527 | orchestrator | 2025-09-27 01:15:51 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:15:51.463558 | orchestrator | 2025-09-27 01:15:51 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:15:54.511245 | orchestrator | 2025-09-27 01:15:54 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:15:54.513503 | orchestrator | 2025-09-27 01:15:54 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:15:54.513543 | orchestrator | 2025-09-27 01:15:54 | INFO  | Wait 1 second(s) 
until the next check 2025-09-27 01:15:57.563643 | orchestrator | 2025-09-27 01:15:57 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:15:57.563749 | orchestrator | 2025-09-27 01:15:57 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:15:57.563773 | orchestrator | 2025-09-27 01:15:57 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:16:00.604961 | orchestrator | 2025-09-27 01:16:00 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:16:00.608402 | orchestrator | 2025-09-27 01:16:00 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:16:00.608441 | orchestrator | 2025-09-27 01:16:00 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:16:03.653519 | orchestrator | 2025-09-27 01:16:03 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:16:03.655600 | orchestrator | 2025-09-27 01:16:03 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:16:03.655843 | orchestrator | 2025-09-27 01:16:03 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:16:06.702835 | orchestrator | 2025-09-27 01:16:06 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:16:06.704628 | orchestrator | 2025-09-27 01:16:06 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:16:06.704659 | orchestrator | 2025-09-27 01:16:06 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:16:09.750683 | orchestrator | 2025-09-27 01:16:09 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:16:09.751964 | orchestrator | 2025-09-27 01:16:09 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:16:09.752125 | orchestrator | 2025-09-27 01:16:09 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:16:12.795256 | orchestrator | 2025-09-27 01:16:12 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:16:12.796115 | orchestrator | 2025-09-27 01:16:12 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:16:12.796192 | orchestrator | 2025-09-27 01:16:12 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:16:15.843682 | orchestrator | 2025-09-27 01:16:15 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:16:15.845156 | orchestrator | 2025-09-27 01:16:15 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:16:15.845205 | orchestrator | 2025-09-27 01:16:15 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:16:18.889721 | orchestrator | 2025-09-27 01:16:18 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:16:18.890620 | orchestrator | 2025-09-27 01:16:18 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:16:18.890654 | orchestrator | 2025-09-27 01:16:18 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:16:21.930864 | orchestrator | 2025-09-27 01:16:21 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:16:21.932664 | orchestrator | 2025-09-27 01:16:21 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:16:21.932715 | orchestrator | 2025-09-27 01:16:21 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:16:24.976157 | orchestrator | 2025-09-27 01:16:24 | INFO  | 
Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:16:24.977944 | orchestrator | 2025-09-27 01:16:24 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:16:24.978006 | orchestrator | 2025-09-27 01:16:24 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:16:28.024283 | orchestrator | 2025-09-27 01:16:28 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:16:28.028256 | orchestrator | 2025-09-27 01:16:28 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:16:28.028793 | orchestrator | 2025-09-27 01:16:28 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:16:31.072546 | orchestrator | 2025-09-27 01:16:31 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:16:31.076050 | orchestrator | 2025-09-27 01:16:31 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:16:31.076105 | orchestrator | 2025-09-27 01:16:31 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:16:34.123720 | orchestrator | 2025-09-27 01:16:34 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:16:34.125707 | orchestrator | 2025-09-27 01:16:34 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:16:34.125749 | orchestrator | 2025-09-27 01:16:34 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:16:37.171258 | orchestrator | 2025-09-27 01:16:37 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:16:37.172673 | orchestrator | 2025-09-27 01:16:37 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:16:37.172827 | orchestrator | 2025-09-27 01:16:37 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:16:40.221723 | orchestrator | 2025-09-27 01:16:40 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:16:40.222560 | orchestrator | 2025-09-27 01:16:40 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:16:40.222683 | orchestrator | 2025-09-27 01:16:40 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:16:43.268445 | orchestrator | 2025-09-27 01:16:43 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:16:43.270125 | orchestrator | 2025-09-27 01:16:43 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:16:43.270213 | orchestrator | 2025-09-27 01:16:43 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:16:46.316799 | orchestrator | 2025-09-27 01:16:46 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:16:46.318252 | orchestrator | 2025-09-27 01:16:46 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:16:46.318287 | orchestrator | 2025-09-27 01:16:46 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:16:49.360498 | orchestrator | 2025-09-27 01:16:49 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:16:49.362772 | orchestrator | 2025-09-27 01:16:49 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:16:49.362867 | orchestrator | 2025-09-27 01:16:49 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:16:52.410275 | orchestrator | 2025-09-27 01:16:52 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:16:52.411877 | 
orchestrator | 2025-09-27 01:16:52 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED
2025-09-27 01:16:52.411909 | orchestrator | 2025-09-27 01:16:52 | INFO  | Wait 1 second(s) until the next check
[... the same two status checks and the wait message repeat roughly every 3 seconds from 2025-09-27 01:16:55 through 2025-09-27 01:26:16; tasks c8c195a8-0572-4728-82e9-0d11795e0ba9 and 6080a85d-265e-44df-8fd4-200b92feb3b5 remain in state STARTED for the entire interval ...]
2025-09-27 01:26:13.486434 | orchestrator | 2025-09-27 01:26:13 | INFO  | Wait 1 second(s) until the next check
2025-09-27 01:26:16.533741 | orchestrator | 2025-09-27 01:26:16 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED
2025-09-27 01:26:16.535543 |
orchestrator | 2025-09-27 01:26:16 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:26:16.535631 | orchestrator | 2025-09-27 01:26:16 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:26:19.585074 | orchestrator | 2025-09-27 01:26:19 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:26:19.586961 | orchestrator | 2025-09-27 01:26:19 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:26:19.587046 | orchestrator | 2025-09-27 01:26:19 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:26:22.632395 | orchestrator | 2025-09-27 01:26:22 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:26:22.633575 | orchestrator | 2025-09-27 01:26:22 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:26:22.633623 | orchestrator | 2025-09-27 01:26:22 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:26:25.676276 | orchestrator | 2025-09-27 01:26:25 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:26:25.679279 | orchestrator | 2025-09-27 01:26:25 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:26:25.679313 | orchestrator | 2025-09-27 01:26:25 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:26:28.721004 | orchestrator | 2025-09-27 01:26:28 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:26:28.722168 | orchestrator | 2025-09-27 01:26:28 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:26:28.722202 | orchestrator | 2025-09-27 01:26:28 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:26:31.769233 | orchestrator | 2025-09-27 01:26:31 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:26:31.771195 | orchestrator | 2025-09-27 01:26:31 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:26:31.771247 | orchestrator | 2025-09-27 01:26:31 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:26:34.820389 | orchestrator | 2025-09-27 01:26:34 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:26:34.822687 | orchestrator | 2025-09-27 01:26:34 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:26:34.822720 | orchestrator | 2025-09-27 01:26:34 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:26:37.865445 | orchestrator | 2025-09-27 01:26:37 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:26:37.867333 | orchestrator | 2025-09-27 01:26:37 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:26:37.867379 | orchestrator | 2025-09-27 01:26:37 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:26:40.918519 | orchestrator | 2025-09-27 01:26:40 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:26:40.920291 | orchestrator | 2025-09-27 01:26:40 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:26:40.920324 | orchestrator | 2025-09-27 01:26:40 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:26:43.965271 | orchestrator | 2025-09-27 01:26:43 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:26:43.966530 | orchestrator | 2025-09-27 01:26:43 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state 
STARTED 2025-09-27 01:26:43.967165 | orchestrator | 2025-09-27 01:26:43 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:26:47.020071 | orchestrator | 2025-09-27 01:26:47 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:26:47.022522 | orchestrator | 2025-09-27 01:26:47 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:26:47.022812 | orchestrator | 2025-09-27 01:26:47 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:26:50.066563 | orchestrator | 2025-09-27 01:26:50 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:26:50.068159 | orchestrator | 2025-09-27 01:26:50 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:26:50.068221 | orchestrator | 2025-09-27 01:26:50 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:26:53.116525 | orchestrator | 2025-09-27 01:26:53 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:26:53.118009 | orchestrator | 2025-09-27 01:26:53 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:26:53.118273 | orchestrator | 2025-09-27 01:26:53 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:26:56.155541 | orchestrator | 2025-09-27 01:26:56 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:26:56.157570 | orchestrator | 2025-09-27 01:26:56 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:26:56.158132 | orchestrator | 2025-09-27 01:26:56 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:26:59.205964 | orchestrator | 2025-09-27 01:26:59 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:26:59.207404 | orchestrator | 2025-09-27 01:26:59 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:26:59.207489 | orchestrator | 2025-09-27 01:26:59 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:27:02.260625 | orchestrator | 2025-09-27 01:27:02 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:27:02.261520 | orchestrator | 2025-09-27 01:27:02 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:27:02.261576 | orchestrator | 2025-09-27 01:27:02 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:27:05.306359 | orchestrator | 2025-09-27 01:27:05 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:27:05.306966 | orchestrator | 2025-09-27 01:27:05 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:27:05.307001 | orchestrator | 2025-09-27 01:27:05 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:27:08.352092 | orchestrator | 2025-09-27 01:27:08 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:27:08.353440 | orchestrator | 2025-09-27 01:27:08 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:27:08.353511 | orchestrator | 2025-09-27 01:27:08 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:27:11.395543 | orchestrator | 2025-09-27 01:27:11 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:27:11.397386 | orchestrator | 2025-09-27 01:27:11 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:27:11.397473 | orchestrator | 2025-09-27 01:27:11 | INFO  | Wait 1 second(s) 
until the next check 2025-09-27 01:27:14.437818 | orchestrator | 2025-09-27 01:27:14 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:27:14.439044 | orchestrator | 2025-09-27 01:27:14 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:27:14.439133 | orchestrator | 2025-09-27 01:27:14 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:27:17.488338 | orchestrator | 2025-09-27 01:27:17 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:27:17.489827 | orchestrator | 2025-09-27 01:27:17 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:27:17.489999 | orchestrator | 2025-09-27 01:27:17 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:27:20.543187 | orchestrator | 2025-09-27 01:27:20 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:27:20.545033 | orchestrator | 2025-09-27 01:27:20 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:27:20.545102 | orchestrator | 2025-09-27 01:27:20 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:27:23.589516 | orchestrator | 2025-09-27 01:27:23 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:27:23.591159 | orchestrator | 2025-09-27 01:27:23 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:27:23.591235 | orchestrator | 2025-09-27 01:27:23 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:27:26.639586 | orchestrator | 2025-09-27 01:27:26 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:27:26.640878 | orchestrator | 2025-09-27 01:27:26 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:27:26.640983 | orchestrator | 2025-09-27 01:27:26 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:27:29.689169 | orchestrator | 2025-09-27 01:27:29 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:27:29.690835 | orchestrator | 2025-09-27 01:27:29 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:27:29.691105 | orchestrator | 2025-09-27 01:27:29 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:27:32.739905 | orchestrator | 2025-09-27 01:27:32 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:27:32.740912 | orchestrator | 2025-09-27 01:27:32 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:27:32.740944 | orchestrator | 2025-09-27 01:27:32 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:27:35.788908 | orchestrator | 2025-09-27 01:27:35 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:27:35.790469 | orchestrator | 2025-09-27 01:27:35 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:27:35.790550 | orchestrator | 2025-09-27 01:27:35 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:27:38.838784 | orchestrator | 2025-09-27 01:27:38 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:27:38.840080 | orchestrator | 2025-09-27 01:27:38 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:27:38.840191 | orchestrator | 2025-09-27 01:27:38 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:27:41.892793 | orchestrator | 2025-09-27 01:27:41 | INFO  | 
Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:27:41.894193 | orchestrator | 2025-09-27 01:27:41 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:27:41.894696 | orchestrator | 2025-09-27 01:27:41 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:27:44.936544 | orchestrator | 2025-09-27 01:27:44 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:27:44.937726 | orchestrator | 2025-09-27 01:27:44 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:27:44.937758 | orchestrator | 2025-09-27 01:27:44 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:27:47.985454 | orchestrator | 2025-09-27 01:27:47 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:27:47.987389 | orchestrator | 2025-09-27 01:27:47 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:27:47.987423 | orchestrator | 2025-09-27 01:27:47 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:27:51.036557 | orchestrator | 2025-09-27 01:27:51 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:27:51.037940 | orchestrator | 2025-09-27 01:27:51 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:27:51.038144 | orchestrator | 2025-09-27 01:27:51 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:27:54.088238 | orchestrator | 2025-09-27 01:27:54 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:27:54.088656 | orchestrator | 2025-09-27 01:27:54 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:27:54.088685 | orchestrator | 2025-09-27 01:27:54 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:27:57.134232 | orchestrator | 2025-09-27 01:27:57 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:27:57.136361 | orchestrator | 2025-09-27 01:27:57 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:27:57.136400 | orchestrator | 2025-09-27 01:27:57 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:28:00.181464 | orchestrator | 2025-09-27 01:28:00 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:28:00.183441 | orchestrator | 2025-09-27 01:28:00 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:28:00.183476 | orchestrator | 2025-09-27 01:28:00 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:28:03.219178 | orchestrator | 2025-09-27 01:28:03 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:28:03.220944 | orchestrator | 2025-09-27 01:28:03 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:28:03.220973 | orchestrator | 2025-09-27 01:28:03 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:28:06.274308 | orchestrator | 2025-09-27 01:28:06 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:28:06.275916 | orchestrator | 2025-09-27 01:28:06 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:28:06.276352 | orchestrator | 2025-09-27 01:28:06 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:28:09.322338 | orchestrator | 2025-09-27 01:28:09 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:28:09.323961 | 
orchestrator | 2025-09-27 01:28:09 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:28:09.323994 | orchestrator | 2025-09-27 01:28:09 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:28:12.371619 | orchestrator | 2025-09-27 01:28:12 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:28:12.375581 | orchestrator | 2025-09-27 01:28:12 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:28:12.375835 | orchestrator | 2025-09-27 01:28:12 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:28:15.424244 | orchestrator | 2025-09-27 01:28:15 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:28:15.426107 | orchestrator | 2025-09-27 01:28:15 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:28:15.426363 | orchestrator | 2025-09-27 01:28:15 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:28:18.479105 | orchestrator | 2025-09-27 01:28:18 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:28:18.480490 | orchestrator | 2025-09-27 01:28:18 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:28:18.480498 | orchestrator | 2025-09-27 01:28:18 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:28:21.528506 | orchestrator | 2025-09-27 01:28:21 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:28:21.529772 | orchestrator | 2025-09-27 01:28:21 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:28:21.529804 | orchestrator | 2025-09-27 01:28:21 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:28:24.578954 | orchestrator | 2025-09-27 01:28:24 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:28:24.580475 | orchestrator | 2025-09-27 01:28:24 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:28:24.580517 | orchestrator | 2025-09-27 01:28:24 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:28:27.624800 | orchestrator | 2025-09-27 01:28:27 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:28:27.627027 | orchestrator | 2025-09-27 01:28:27 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:28:27.627070 | orchestrator | 2025-09-27 01:28:27 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:28:30.667243 | orchestrator | 2025-09-27 01:28:30 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:28:30.668703 | orchestrator | 2025-09-27 01:28:30 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:28:30.668735 | orchestrator | 2025-09-27 01:28:30 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:28:33.713176 | orchestrator | 2025-09-27 01:28:33 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:28:33.715532 | orchestrator | 2025-09-27 01:28:33 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:28:33.715571 | orchestrator | 2025-09-27 01:28:33 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:28:36.762571 | orchestrator | 2025-09-27 01:28:36 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:28:36.764971 | orchestrator | 2025-09-27 01:28:36 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state 
STARTED 2025-09-27 01:28:36.765017 | orchestrator | 2025-09-27 01:28:36 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:28:39.812735 | orchestrator | 2025-09-27 01:28:39 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:28:39.813414 | orchestrator | 2025-09-27 01:28:39 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:28:39.813655 | orchestrator | 2025-09-27 01:28:39 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:28:42.862284 | orchestrator | 2025-09-27 01:28:42 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:28:42.863929 | orchestrator | 2025-09-27 01:28:42 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:28:42.864028 | orchestrator | 2025-09-27 01:28:42 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:28:45.911129 | orchestrator | 2025-09-27 01:28:45 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:28:45.912607 | orchestrator | 2025-09-27 01:28:45 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:28:45.912692 | orchestrator | 2025-09-27 01:28:45 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:28:48.960733 | orchestrator | 2025-09-27 01:28:48 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:28:48.962683 | orchestrator | 2025-09-27 01:28:48 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:28:48.963419 | orchestrator | 2025-09-27 01:28:48 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:28:52.014986 | orchestrator | 2025-09-27 01:28:52 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:28:52.016767 | orchestrator | 2025-09-27 01:28:52 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:28:52.016881 | orchestrator | 2025-09-27 01:28:52 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:28:55.054677 | orchestrator | 2025-09-27 01:28:55 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:28:55.055211 | orchestrator | 2025-09-27 01:28:55 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:28:55.055300 | orchestrator | 2025-09-27 01:28:55 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:28:58.101289 | orchestrator | 2025-09-27 01:28:58 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:28:58.103510 | orchestrator | 2025-09-27 01:28:58 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:28:58.103591 | orchestrator | 2025-09-27 01:28:58 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:29:01.141050 | orchestrator | 2025-09-27 01:29:01 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:29:01.142634 | orchestrator | 2025-09-27 01:29:01 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:29:01.142671 | orchestrator | 2025-09-27 01:29:01 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:29:04.188937 | orchestrator | 2025-09-27 01:29:04 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:29:04.190428 | orchestrator | 2025-09-27 01:29:04 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:29:04.191102 | orchestrator | 2025-09-27 01:29:04 | INFO  | Wait 1 second(s) 
until the next check 2025-09-27 01:29:07.235456 | orchestrator | 2025-09-27 01:29:07 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:29:07.236545 | orchestrator | 2025-09-27 01:29:07 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:29:07.237068 | orchestrator | 2025-09-27 01:29:07 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:29:10.282664 | orchestrator | 2025-09-27 01:29:10 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:29:10.283699 | orchestrator | 2025-09-27 01:29:10 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:29:10.283836 | orchestrator | 2025-09-27 01:29:10 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:29:13.329586 | orchestrator | 2025-09-27 01:29:13 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:29:13.331029 | orchestrator | 2025-09-27 01:29:13 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:29:13.331076 | orchestrator | 2025-09-27 01:29:13 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:29:16.375031 | orchestrator | 2025-09-27 01:29:16 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:29:16.376338 | orchestrator | 2025-09-27 01:29:16 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:29:16.376508 | orchestrator | 2025-09-27 01:29:16 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:29:19.425306 | orchestrator | 2025-09-27 01:29:19 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:29:19.426703 | orchestrator | 2025-09-27 01:29:19 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:29:19.426740 | orchestrator | 2025-09-27 01:29:19 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:29:22.474131 | orchestrator | 2025-09-27 01:29:22 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:29:22.475142 | orchestrator | 2025-09-27 01:29:22 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:29:22.475177 | orchestrator | 2025-09-27 01:29:22 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:29:25.523810 | orchestrator | 2025-09-27 01:29:25 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:29:25.525087 | orchestrator | 2025-09-27 01:29:25 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:29:25.525120 | orchestrator | 2025-09-27 01:29:25 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:29:28.569452 | orchestrator | 2025-09-27 01:29:28 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:29:28.570343 | orchestrator | 2025-09-27 01:29:28 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:29:28.570371 | orchestrator | 2025-09-27 01:29:28 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:29:31.614519 | orchestrator | 2025-09-27 01:29:31 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:29:31.616610 | orchestrator | 2025-09-27 01:29:31 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:29:31.616731 | orchestrator | 2025-09-27 01:29:31 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:29:34.654295 | orchestrator | 2025-09-27 01:29:34 | INFO  | 
Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:29:34.655562 | orchestrator | 2025-09-27 01:29:34 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:29:34.655749 | orchestrator | 2025-09-27 01:29:34 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:29:37.702331 | orchestrator | 2025-09-27 01:29:37 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:29:37.704805 | orchestrator | 2025-09-27 01:29:37 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:29:37.704916 | orchestrator | 2025-09-27 01:29:37 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:29:40.753609 | orchestrator | 2025-09-27 01:29:40 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:29:40.755442 | orchestrator | 2025-09-27 01:29:40 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:29:40.755477 | orchestrator | 2025-09-27 01:29:40 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:29:43.799895 | orchestrator | 2025-09-27 01:29:43 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:29:43.800426 | orchestrator | 2025-09-27 01:29:43 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:29:43.800461 | orchestrator | 2025-09-27 01:29:43 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:29:46.849195 | orchestrator | 2025-09-27 01:29:46 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:29:46.850367 | orchestrator | 2025-09-27 01:29:46 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:29:46.850500 | orchestrator | 2025-09-27 01:29:46 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:29:49.897366 | orchestrator | 2025-09-27 01:29:49 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:29:49.898596 | orchestrator | 2025-09-27 01:29:49 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:29:49.898669 | orchestrator | 2025-09-27 01:29:49 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:29:52.946544 | orchestrator | 2025-09-27 01:29:52 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:29:52.947703 | orchestrator | 2025-09-27 01:29:52 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:29:52.947750 | orchestrator | 2025-09-27 01:29:52 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:29:55.989991 | orchestrator | 2025-09-27 01:29:55 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:29:55.991236 | orchestrator | 2025-09-27 01:29:55 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:29:55.991267 | orchestrator | 2025-09-27 01:29:55 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:29:59.029039 | orchestrator | 2025-09-27 01:29:59 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:29:59.030171 | orchestrator | 2025-09-27 01:29:59 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:29:59.030223 | orchestrator | 2025-09-27 01:29:59 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:30:02.069633 | orchestrator | 2025-09-27 01:30:02 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:30:02.070441 | 
orchestrator | 2025-09-27 01:30:02 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:30:02.070478 | orchestrator | 2025-09-27 01:30:02 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:30:05.114360 | orchestrator | 2025-09-27 01:30:05 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:30:05.116469 | orchestrator | 2025-09-27 01:30:05 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:30:05.116546 | orchestrator | 2025-09-27 01:30:05 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:30:08.157332 | orchestrator | 2025-09-27 01:30:08 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:30:08.158995 | orchestrator | 2025-09-27 01:30:08 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:30:08.159089 | orchestrator | 2025-09-27 01:30:08 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:30:11.198840 | orchestrator | 2025-09-27 01:30:11 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:30:11.200353 | orchestrator | 2025-09-27 01:30:11 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:30:11.200383 | orchestrator | 2025-09-27 01:30:11 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:30:14.247141 | orchestrator | 2025-09-27 01:30:14 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:30:14.250182 | orchestrator | 2025-09-27 01:30:14 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:30:14.250229 | orchestrator | 2025-09-27 01:30:14 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:30:17.291387 | orchestrator | 2025-09-27 01:30:17 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:30:17.293055 | orchestrator | 2025-09-27 01:30:17 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:30:17.293149 | orchestrator | 2025-09-27 01:30:17 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:30:20.336565 | orchestrator | 2025-09-27 01:30:20 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:30:20.337982 | orchestrator | 2025-09-27 01:30:20 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:30:20.338061 | orchestrator | 2025-09-27 01:30:20 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:30:23.381087 | orchestrator | 2025-09-27 01:30:23 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:30:23.383226 | orchestrator | 2025-09-27 01:30:23 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:30:23.383361 | orchestrator | 2025-09-27 01:30:23 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:30:26.426166 | orchestrator | 2025-09-27 01:30:26 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:30:26.428349 | orchestrator | 2025-09-27 01:30:26 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:30:26.428755 | orchestrator | 2025-09-27 01:30:26 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:30:29.472842 | orchestrator | 2025-09-27 01:30:29 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:30:29.474367 | orchestrator | 2025-09-27 01:30:29 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state 
STARTED 2025-09-27 01:30:29.474393 | orchestrator | 2025-09-27 01:30:29 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:30:32.514295 | orchestrator | 2025-09-27 01:30:32 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:30:32.516454 | orchestrator | 2025-09-27 01:30:32 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:30:32.516483 | orchestrator | 2025-09-27 01:30:32 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:30:35.562901 | orchestrator | 2025-09-27 01:30:35 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:30:35.563555 | orchestrator | 2025-09-27 01:30:35 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:30:35.563588 | orchestrator | 2025-09-27 01:30:35 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:30:38.611206 | orchestrator | 2025-09-27 01:30:38 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:30:38.612882 | orchestrator | 2025-09-27 01:30:38 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:30:38.612914 | orchestrator | 2025-09-27 01:30:38 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:30:41.654638 | orchestrator | 2025-09-27 01:30:41 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:30:41.656065 | orchestrator | 2025-09-27 01:30:41 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:30:41.656103 | orchestrator | 2025-09-27 01:30:41 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:30:44.696944 | orchestrator | 2025-09-27 01:30:44 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:30:44.699074 | orchestrator | 2025-09-27 01:30:44 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:30:44.699450 | orchestrator | 2025-09-27 01:30:44 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:30:47.743509 | orchestrator | 2025-09-27 01:30:47 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:30:47.747071 | orchestrator | 2025-09-27 01:30:47 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:30:47.747210 | orchestrator | 2025-09-27 01:30:47 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:30:50.795323 | orchestrator | 2025-09-27 01:30:50 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:30:50.798268 | orchestrator | 2025-09-27 01:30:50 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:30:50.798412 | orchestrator | 2025-09-27 01:30:50 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:30:53.847114 | orchestrator | 2025-09-27 01:30:53 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:30:53.848892 | orchestrator | 2025-09-27 01:30:53 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:30:53.848922 | orchestrator | 2025-09-27 01:30:53 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:30:56.885308 | orchestrator | 2025-09-27 01:30:56 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:30:56.886159 | orchestrator | 2025-09-27 01:30:56 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:30:56.886192 | orchestrator | 2025-09-27 01:30:56 | INFO  | Wait 1 second(s) 
until the next check 2025-09-27 01:30:59.928796 | orchestrator | 2025-09-27 01:30:59 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:30:59.929037 | orchestrator | 2025-09-27 01:30:59 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:30:59.929059 | orchestrator | 2025-09-27 01:30:59 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:31:02.964468 | orchestrator | 2025-09-27 01:31:02 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:31:02.966063 | orchestrator | 2025-09-27 01:31:02 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:31:02.966090 | orchestrator | 2025-09-27 01:31:02 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:31:06.007257 | orchestrator | 2025-09-27 01:31:06 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:31:06.008691 | orchestrator | 2025-09-27 01:31:06 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:31:06.008729 | orchestrator | 2025-09-27 01:31:06 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:31:09.055023 | orchestrator | 2025-09-27 01:31:09 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:31:09.055127 | orchestrator | 2025-09-27 01:31:09 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:31:09.055142 | orchestrator | 2025-09-27 01:31:09 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:31:12.096346 | orchestrator | 2025-09-27 01:31:12 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:31:12.097582 | orchestrator | 2025-09-27 01:31:12 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:31:12.097671 | orchestrator | 2025-09-27 01:31:12 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:31:15.144511 | orchestrator | 2025-09-27 01:31:15 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:31:15.145241 | orchestrator | 2025-09-27 01:31:15 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:31:15.145325 | orchestrator | 2025-09-27 01:31:15 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:31:18.187511 | orchestrator | 2025-09-27 01:31:18 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:31:18.188404 | orchestrator | 2025-09-27 01:31:18 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:31:18.188434 | orchestrator | 2025-09-27 01:31:18 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:31:21.225686 | orchestrator | 2025-09-27 01:31:21 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:31:21.227516 | orchestrator | 2025-09-27 01:31:21 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:31:21.227560 | orchestrator | 2025-09-27 01:31:21 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:31:24.271912 | orchestrator | 2025-09-27 01:31:24 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:31:24.272004 | orchestrator | 2025-09-27 01:31:24 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:31:24.272018 | orchestrator | 2025-09-27 01:31:24 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:31:27.314469 | orchestrator | 2025-09-27 01:31:27 | INFO  | 
Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:31:27.315696 | orchestrator | 2025-09-27 01:31:27 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:31:27.315727 | orchestrator | 2025-09-27 01:31:27 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:31:30.356880 | orchestrator | 2025-09-27 01:31:30 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:31:30.358991 | orchestrator | 2025-09-27 01:31:30 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:31:30.359019 | orchestrator | 2025-09-27 01:31:30 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:31:33.396146 | orchestrator | 2025-09-27 01:31:33 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:31:33.396699 | orchestrator | 2025-09-27 01:31:33 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:31:33.396955 | orchestrator | 2025-09-27 01:31:33 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:31:36.435422 | orchestrator | 2025-09-27 01:31:36 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:31:36.437549 | orchestrator | 2025-09-27 01:31:36 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:31:36.437594 | orchestrator | 2025-09-27 01:31:36 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:31:39.485399 | orchestrator | 2025-09-27 01:31:39 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:31:39.486169 | orchestrator | 2025-09-27 01:31:39 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:31:39.486196 | orchestrator | 2025-09-27 01:31:39 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:31:42.552664 | orchestrator | 2025-09-27 01:31:42 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:31:42.554005 | orchestrator | 2025-09-27 01:31:42 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:31:42.554113 | orchestrator | 2025-09-27 01:31:42 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:31:45.619219 | orchestrator | 2025-09-27 01:31:45 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:31:45.620470 | orchestrator | 2025-09-27 01:31:45 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:31:45.620501 | orchestrator | 2025-09-27 01:31:45 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:31:48.669040 | orchestrator | 2025-09-27 01:31:48 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:31:48.671062 | orchestrator | 2025-09-27 01:31:48 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:31:48.671160 | orchestrator | 2025-09-27 01:31:48 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:31:51.712781 | orchestrator | 2025-09-27 01:31:51 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:31:51.714791 | orchestrator | 2025-09-27 01:31:51 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:31:51.714861 | orchestrator | 2025-09-27 01:31:51 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:31:54.763002 | orchestrator | 2025-09-27 01:31:54 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:31:54.764275 | 
orchestrator | 2025-09-27 01:31:54 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:31:54.764307 | orchestrator | 2025-09-27 01:31:54 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:31:57.836604 | orchestrator | 2025-09-27 01:31:57 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:31:57.837620 | orchestrator | 2025-09-27 01:31:57 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:31:57.837651 | orchestrator | 2025-09-27 01:31:57 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:32:00.881884 | orchestrator | 2025-09-27 01:32:00 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:32:00.883627 | orchestrator | 2025-09-27 01:32:00 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:32:00.883671 | orchestrator | 2025-09-27 01:32:00 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:32:03.929238 | orchestrator | 2025-09-27 01:32:03 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:32:03.930900 | orchestrator | 2025-09-27 01:32:03 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:32:03.931235 | orchestrator | 2025-09-27 01:32:03 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:32:06.974928 | orchestrator | 2025-09-27 01:32:06 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:32:06.976163 | orchestrator | 2025-09-27 01:32:06 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:32:06.976200 | orchestrator | 2025-09-27 01:32:06 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:32:10.015708 | orchestrator | 2025-09-27 01:32:10 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:32:10.016908 | orchestrator | 2025-09-27 01:32:10 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:32:10.016951 | orchestrator | 2025-09-27 01:32:10 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:32:13.069147 | orchestrator | 2025-09-27 01:32:13 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:32:13.069632 | orchestrator | 2025-09-27 01:32:13 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:32:13.070162 | orchestrator | 2025-09-27 01:32:13 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:32:16.124250 | orchestrator | 2025-09-27 01:32:16 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:32:16.125559 | orchestrator | 2025-09-27 01:32:16 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:32:16.125697 | orchestrator | 2025-09-27 01:32:16 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:32:19.175774 | orchestrator | 2025-09-27 01:32:19 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:32:19.177609 | orchestrator | 2025-09-27 01:32:19 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:32:19.177651 | orchestrator | 2025-09-27 01:32:19 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:32:22.230289 | orchestrator | 2025-09-27 01:32:22 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:32:22.231518 | orchestrator | 2025-09-27 01:32:22 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state 
STARTED
2025-09-27 01:32:22.231783 | orchestrator | 2025-09-27 01:32:22 | INFO  | Wait 1 second(s) until the next check
2025-09-27 01:32:25.282533 | orchestrator | 2025-09-27 01:32:25 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED
2025-09-27 01:32:25.285374 | orchestrator | 2025-09-27 01:32:25 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED
2025-09-27 01:32:25.285407 | orchestrator | 2025-09-27 01:32:25 | INFO  | Wait 1 second(s) until the next check
[2025-09-27 01:32:28 through 2025-09-27 01:41:42 | orchestrator | INFO  | Tasks c8c195a8-0572-4728-82e9-0d11795e0ba9 and 6080a85d-265e-44df-8fd4-200b92feb3b5 remain in state STARTED; the same status check and "Wait 1 second(s) until the next check" message repeat roughly every 3 seconds]
2025-09-27 01:41:46.038599 | orchestrator | 2025-09-27 01:41:46 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED
2025-09-27 01:41:46.040919 | orchestrator | 2025-09-27 01:41:46 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state
STARTED 2025-09-27 01:41:46.040989 | orchestrator | 2025-09-27 01:41:46 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:41:49.094640 | orchestrator | 2025-09-27 01:41:49 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:41:49.097153 | orchestrator | 2025-09-27 01:41:49 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:41:49.097438 | orchestrator | 2025-09-27 01:41:49 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:41:52.147156 | orchestrator | 2025-09-27 01:41:52 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:41:52.150204 | orchestrator | 2025-09-27 01:41:52 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:41:52.150283 | orchestrator | 2025-09-27 01:41:52 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:41:55.198967 | orchestrator | 2025-09-27 01:41:55 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:41:55.200640 | orchestrator | 2025-09-27 01:41:55 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:41:55.200675 | orchestrator | 2025-09-27 01:41:55 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:41:58.247160 | orchestrator | 2025-09-27 01:41:58 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:41:58.248924 | orchestrator | 2025-09-27 01:41:58 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:41:58.249002 | orchestrator | 2025-09-27 01:41:58 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:42:01.295239 | orchestrator | 2025-09-27 01:42:01 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:42:01.297204 | orchestrator | 2025-09-27 01:42:01 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:42:01.297235 | orchestrator | 2025-09-27 01:42:01 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:42:04.346756 | orchestrator | 2025-09-27 01:42:04 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:42:04.347363 | orchestrator | 2025-09-27 01:42:04 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:42:04.347395 | orchestrator | 2025-09-27 01:42:04 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:42:07.398860 | orchestrator | 2025-09-27 01:42:07 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:42:07.399572 | orchestrator | 2025-09-27 01:42:07 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:42:07.399827 | orchestrator | 2025-09-27 01:42:07 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:42:10.445310 | orchestrator | 2025-09-27 01:42:10 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:42:10.447781 | orchestrator | 2025-09-27 01:42:10 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:42:10.448131 | orchestrator | 2025-09-27 01:42:10 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:42:13.496609 | orchestrator | 2025-09-27 01:42:13 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:42:13.498521 | orchestrator | 2025-09-27 01:42:13 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:42:13.498547 | orchestrator | 2025-09-27 01:42:13 | INFO  | Wait 1 second(s) 
until the next check 2025-09-27 01:42:16.543171 | orchestrator | 2025-09-27 01:42:16 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:42:16.544063 | orchestrator | 2025-09-27 01:42:16 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:42:16.544094 | orchestrator | 2025-09-27 01:42:16 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:42:19.589750 | orchestrator | 2025-09-27 01:42:19 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:42:19.592534 | orchestrator | 2025-09-27 01:42:19 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:42:19.592575 | orchestrator | 2025-09-27 01:42:19 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:42:22.639988 | orchestrator | 2025-09-27 01:42:22 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:42:22.642003 | orchestrator | 2025-09-27 01:42:22 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:42:22.642292 | orchestrator | 2025-09-27 01:42:22 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:42:25.685557 | orchestrator | 2025-09-27 01:42:25 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:42:25.688163 | orchestrator | 2025-09-27 01:42:25 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:42:25.688271 | orchestrator | 2025-09-27 01:42:25 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:42:28.733199 | orchestrator | 2025-09-27 01:42:28 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:42:28.734288 | orchestrator | 2025-09-27 01:42:28 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:42:28.734317 | orchestrator | 2025-09-27 01:42:28 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:42:31.780598 | orchestrator | 2025-09-27 01:42:31 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:42:31.782696 | orchestrator | 2025-09-27 01:42:31 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:42:31.782728 | orchestrator | 2025-09-27 01:42:31 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:42:34.831874 | orchestrator | 2025-09-27 01:42:34 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:42:34.833040 | orchestrator | 2025-09-27 01:42:34 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:42:34.833203 | orchestrator | 2025-09-27 01:42:34 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:42:37.863277 | orchestrator | 2025-09-27 01:42:37 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:42:37.863850 | orchestrator | 2025-09-27 01:42:37 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:42:37.863885 | orchestrator | 2025-09-27 01:42:37 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:42:40.909431 | orchestrator | 2025-09-27 01:42:40 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:42:40.912575 | orchestrator | 2025-09-27 01:42:40 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:42:40.912609 | orchestrator | 2025-09-27 01:42:40 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:42:43.954111 | orchestrator | 2025-09-27 01:42:43 | INFO  | 
Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:42:43.955784 | orchestrator | 2025-09-27 01:42:43 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:42:43.955872 | orchestrator | 2025-09-27 01:42:43 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:42:47.001745 | orchestrator | 2025-09-27 01:42:47 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:42:47.003491 | orchestrator | 2025-09-27 01:42:47 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:42:47.003611 | orchestrator | 2025-09-27 01:42:47 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:42:50.050572 | orchestrator | 2025-09-27 01:42:50 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:42:50.052292 | orchestrator | 2025-09-27 01:42:50 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:42:50.052384 | orchestrator | 2025-09-27 01:42:50 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:42:53.094455 | orchestrator | 2025-09-27 01:42:53 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:42:53.095621 | orchestrator | 2025-09-27 01:42:53 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:42:53.095654 | orchestrator | 2025-09-27 01:42:53 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:42:56.132588 | orchestrator | 2025-09-27 01:42:56 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:42:56.134826 | orchestrator | 2025-09-27 01:42:56 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:42:56.134861 | orchestrator | 2025-09-27 01:42:56 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:42:59.181189 | orchestrator | 2025-09-27 01:42:59 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:42:59.182521 | orchestrator | 2025-09-27 01:42:59 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:42:59.182555 | orchestrator | 2025-09-27 01:42:59 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:43:02.227560 | orchestrator | 2025-09-27 01:43:02 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:43:02.228975 | orchestrator | 2025-09-27 01:43:02 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:43:02.229012 | orchestrator | 2025-09-27 01:43:02 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:43:05.276517 | orchestrator | 2025-09-27 01:43:05 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:43:05.277369 | orchestrator | 2025-09-27 01:43:05 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:43:05.277776 | orchestrator | 2025-09-27 01:43:05 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:43:08.325102 | orchestrator | 2025-09-27 01:43:08 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:43:08.326468 | orchestrator | 2025-09-27 01:43:08 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:43:08.326558 | orchestrator | 2025-09-27 01:43:08 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:43:11.372081 | orchestrator | 2025-09-27 01:43:11 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:43:11.373913 | 
orchestrator | 2025-09-27 01:43:11 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:43:11.373943 | orchestrator | 2025-09-27 01:43:11 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:43:14.418230 | orchestrator | 2025-09-27 01:43:14 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:43:14.419557 | orchestrator | 2025-09-27 01:43:14 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:43:14.419846 | orchestrator | 2025-09-27 01:43:14 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:43:17.457279 | orchestrator | 2025-09-27 01:43:17 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:43:17.459103 | orchestrator | 2025-09-27 01:43:17 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:43:17.459142 | orchestrator | 2025-09-27 01:43:17 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:43:20.503220 | orchestrator | 2025-09-27 01:43:20 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:43:20.505950 | orchestrator | 2025-09-27 01:43:20 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:43:20.505982 | orchestrator | 2025-09-27 01:43:20 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:43:23.552180 | orchestrator | 2025-09-27 01:43:23 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:43:23.553698 | orchestrator | 2025-09-27 01:43:23 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:43:23.553741 | orchestrator | 2025-09-27 01:43:23 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:43:26.605545 | orchestrator | 2025-09-27 01:43:26 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:43:26.608461 | orchestrator | 2025-09-27 01:43:26 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:43:26.608768 | orchestrator | 2025-09-27 01:43:26 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:43:29.654260 | orchestrator | 2025-09-27 01:43:29 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:43:29.655771 | orchestrator | 2025-09-27 01:43:29 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:43:29.655840 | orchestrator | 2025-09-27 01:43:29 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:43:32.702174 | orchestrator | 2025-09-27 01:43:32 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:43:32.703434 | orchestrator | 2025-09-27 01:43:32 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:43:32.703465 | orchestrator | 2025-09-27 01:43:32 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:43:35.753128 | orchestrator | 2025-09-27 01:43:35 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:43:35.754532 | orchestrator | 2025-09-27 01:43:35 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:43:35.754566 | orchestrator | 2025-09-27 01:43:35 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:43:38.800496 | orchestrator | 2025-09-27 01:43:38 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:43:38.801263 | orchestrator | 2025-09-27 01:43:38 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state 
STARTED 2025-09-27 01:43:38.801296 | orchestrator | 2025-09-27 01:43:38 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:43:41.847702 | orchestrator | 2025-09-27 01:43:41 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:43:41.849614 | orchestrator | 2025-09-27 01:43:41 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:43:41.849647 | orchestrator | 2025-09-27 01:43:41 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:43:44.898394 | orchestrator | 2025-09-27 01:43:44 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:43:44.899605 | orchestrator | 2025-09-27 01:43:44 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:43:44.899636 | orchestrator | 2025-09-27 01:43:44 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:43:47.948480 | orchestrator | 2025-09-27 01:43:47 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:43:47.950118 | orchestrator | 2025-09-27 01:43:47 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:43:47.950202 | orchestrator | 2025-09-27 01:43:47 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:43:51.002245 | orchestrator | 2025-09-27 01:43:51 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:43:51.004474 | orchestrator | 2025-09-27 01:43:51 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:43:51.004579 | orchestrator | 2025-09-27 01:43:51 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:43:54.053730 | orchestrator | 2025-09-27 01:43:54 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:43:54.056323 | orchestrator | 2025-09-27 01:43:54 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:43:54.056404 | orchestrator | 2025-09-27 01:43:54 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:43:57.094004 | orchestrator | 2025-09-27 01:43:57 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:43:57.095857 | orchestrator | 2025-09-27 01:43:57 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:43:57.095889 | orchestrator | 2025-09-27 01:43:57 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:44:00.138070 | orchestrator | 2025-09-27 01:44:00 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:44:00.139178 | orchestrator | 2025-09-27 01:44:00 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:44:00.139572 | orchestrator | 2025-09-27 01:44:00 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:44:03.187650 | orchestrator | 2025-09-27 01:44:03 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:44:03.189572 | orchestrator | 2025-09-27 01:44:03 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:44:03.189601 | orchestrator | 2025-09-27 01:44:03 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:44:06.235498 | orchestrator | 2025-09-27 01:44:06 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:44:06.236201 | orchestrator | 2025-09-27 01:44:06 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:44:06.236310 | orchestrator | 2025-09-27 01:44:06 | INFO  | Wait 1 second(s) 
until the next check 2025-09-27 01:44:09.277086 | orchestrator | 2025-09-27 01:44:09 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:44:09.278877 | orchestrator | 2025-09-27 01:44:09 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:44:09.278956 | orchestrator | 2025-09-27 01:44:09 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:44:12.327777 | orchestrator | 2025-09-27 01:44:12 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:44:12.328044 | orchestrator | 2025-09-27 01:44:12 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:44:12.328065 | orchestrator | 2025-09-27 01:44:12 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:44:15.375233 | orchestrator | 2025-09-27 01:44:15 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:44:15.377456 | orchestrator | 2025-09-27 01:44:15 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:44:15.377524 | orchestrator | 2025-09-27 01:44:15 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:44:18.428347 | orchestrator | 2025-09-27 01:44:18 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:44:18.430307 | orchestrator | 2025-09-27 01:44:18 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:44:18.430339 | orchestrator | 2025-09-27 01:44:18 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:44:21.484685 | orchestrator | 2025-09-27 01:44:21 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:44:21.485614 | orchestrator | 2025-09-27 01:44:21 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:44:21.485733 | orchestrator | 2025-09-27 01:44:21 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:44:24.526391 | orchestrator | 2025-09-27 01:44:24 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:44:24.527782 | orchestrator | 2025-09-27 01:44:24 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:44:24.527890 | orchestrator | 2025-09-27 01:44:24 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:44:27.568319 | orchestrator | 2025-09-27 01:44:27 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:44:27.569994 | orchestrator | 2025-09-27 01:44:27 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:44:27.570087 | orchestrator | 2025-09-27 01:44:27 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:44:30.620038 | orchestrator | 2025-09-27 01:44:30 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:44:30.620138 | orchestrator | 2025-09-27 01:44:30 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:44:30.620154 | orchestrator | 2025-09-27 01:44:30 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:44:33.661982 | orchestrator | 2025-09-27 01:44:33 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:44:33.663470 | orchestrator | 2025-09-27 01:44:33 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:44:33.663529 | orchestrator | 2025-09-27 01:44:33 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:44:36.717396 | orchestrator | 2025-09-27 01:44:36 | INFO  | 
Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:44:36.718399 | orchestrator | 2025-09-27 01:44:36 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:44:36.718963 | orchestrator | 2025-09-27 01:44:36 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:44:39.772495 | orchestrator | 2025-09-27 01:44:39 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:44:39.773286 | orchestrator | 2025-09-27 01:44:39 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:44:39.773332 | orchestrator | 2025-09-27 01:44:39 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:44:42.830225 | orchestrator | 2025-09-27 01:44:42 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:44:42.830873 | orchestrator | 2025-09-27 01:44:42 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:44:42.830906 | orchestrator | 2025-09-27 01:44:42 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:44:45.882297 | orchestrator | 2025-09-27 01:44:45 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:44:45.882640 | orchestrator | 2025-09-27 01:44:45 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:44:45.882670 | orchestrator | 2025-09-27 01:44:45 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:44:48.932715 | orchestrator | 2025-09-27 01:44:48 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:44:48.934468 | orchestrator | 2025-09-27 01:44:48 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:44:48.934495 | orchestrator | 2025-09-27 01:44:48 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:44:51.986646 | orchestrator | 2025-09-27 01:44:51 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:44:51.988271 | orchestrator | 2025-09-27 01:44:51 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:44:51.988304 | orchestrator | 2025-09-27 01:44:51 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:44:55.036057 | orchestrator | 2025-09-27 01:44:55 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:44:55.036258 | orchestrator | 2025-09-27 01:44:55 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:44:55.036282 | orchestrator | 2025-09-27 01:44:55 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:44:58.089378 | orchestrator | 2025-09-27 01:44:58 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:44:58.090451 | orchestrator | 2025-09-27 01:44:58 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:44:58.090487 | orchestrator | 2025-09-27 01:44:58 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:45:01.185876 | orchestrator | 2025-09-27 01:45:01 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:45:01.187851 | orchestrator | 2025-09-27 01:45:01 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:45:01.187945 | orchestrator | 2025-09-27 01:45:01 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:45:04.237118 | orchestrator | 2025-09-27 01:45:04 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:45:04.239926 | 
orchestrator | 2025-09-27 01:45:04 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:45:04.240011 | orchestrator | 2025-09-27 01:45:04 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:45:07.301210 | orchestrator | 2025-09-27 01:45:07 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:45:07.303321 | orchestrator | 2025-09-27 01:45:07 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:45:07.303561 | orchestrator | 2025-09-27 01:45:07 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:45:10.359711 | orchestrator | 2025-09-27 01:45:10 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:45:10.360907 | orchestrator | 2025-09-27 01:45:10 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:45:10.360937 | orchestrator | 2025-09-27 01:45:10 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:45:13.429431 | orchestrator | 2025-09-27 01:45:13 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:45:13.432252 | orchestrator | 2025-09-27 01:45:13 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:45:13.432349 | orchestrator | 2025-09-27 01:45:13 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:45:16.481932 | orchestrator | 2025-09-27 01:45:16 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:45:16.483953 | orchestrator | 2025-09-27 01:45:16 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:45:16.483988 | orchestrator | 2025-09-27 01:45:16 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:45:19.548771 | orchestrator | 2025-09-27 01:45:19 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:45:19.551020 | orchestrator | 2025-09-27 01:45:19 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:45:19.551056 | orchestrator | 2025-09-27 01:45:19 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:45:22.605404 | orchestrator | 2025-09-27 01:45:22 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:45:22.607251 | orchestrator | 2025-09-27 01:45:22 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:45:22.607292 | orchestrator | 2025-09-27 01:45:22 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:45:25.654267 | orchestrator | 2025-09-27 01:45:25 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:45:25.655847 | orchestrator | 2025-09-27 01:45:25 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:45:25.655878 | orchestrator | 2025-09-27 01:45:25 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:45:28.705003 | orchestrator | 2025-09-27 01:45:28 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:45:28.707596 | orchestrator | 2025-09-27 01:45:28 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:45:28.707654 | orchestrator | 2025-09-27 01:45:28 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:45:31.754233 | orchestrator | 2025-09-27 01:45:31 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:45:31.755597 | orchestrator | 2025-09-27 01:45:31 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state 
STARTED 2025-09-27 01:45:31.755634 | orchestrator | 2025-09-27 01:45:31 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:45:34.801053 | orchestrator | 2025-09-27 01:45:34 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:45:34.802195 | orchestrator | 2025-09-27 01:45:34 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:45:34.802460 | orchestrator | 2025-09-27 01:45:34 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:45:37.854514 | orchestrator | 2025-09-27 01:45:37 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:45:37.855902 | orchestrator | 2025-09-27 01:45:37 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:45:37.856133 | orchestrator | 2025-09-27 01:45:37 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:45:40.907596 | orchestrator | 2025-09-27 01:45:40 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:45:40.909195 | orchestrator | 2025-09-27 01:45:40 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:45:40.909695 | orchestrator | 2025-09-27 01:45:40 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:45:43.959106 | orchestrator | 2025-09-27 01:45:43 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:45:43.959902 | orchestrator | 2025-09-27 01:45:43 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:45:43.959983 | orchestrator | 2025-09-27 01:45:43 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:45:47.005428 | orchestrator | 2025-09-27 01:45:47 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:45:47.006631 | orchestrator | 2025-09-27 01:45:47 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:45:47.006654 | orchestrator | 2025-09-27 01:45:47 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:45:50.058925 | orchestrator | 2025-09-27 01:45:50 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:45:50.059990 | orchestrator | 2025-09-27 01:45:50 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:45:50.060019 | orchestrator | 2025-09-27 01:45:50 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:45:53.101168 | orchestrator | 2025-09-27 01:45:53 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:45:53.106662 | orchestrator | 2025-09-27 01:45:53 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:45:53.106699 | orchestrator | 2025-09-27 01:45:53 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:45:56.153208 | orchestrator | 2025-09-27 01:45:56 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:45:56.154995 | orchestrator | 2025-09-27 01:45:56 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:45:56.155144 | orchestrator | 2025-09-27 01:45:56 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:45:59.203141 | orchestrator | 2025-09-27 01:45:59 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:45:59.204117 | orchestrator | 2025-09-27 01:45:59 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:45:59.204393 | orchestrator | 2025-09-27 01:45:59 | INFO  | Wait 1 second(s) 
until the next check 2025-09-27 01:46:02.255557 | orchestrator | 2025-09-27 01:46:02 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:46:02.257928 | orchestrator | 2025-09-27 01:46:02 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:46:02.257977 | orchestrator | 2025-09-27 01:46:02 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:46:05.316614 | orchestrator | 2025-09-27 01:46:05 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:46:05.319397 | orchestrator | 2025-09-27 01:46:05 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:46:05.319424 | orchestrator | 2025-09-27 01:46:05 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:46:08.367170 | orchestrator | 2025-09-27 01:46:08 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:46:08.369388 | orchestrator | 2025-09-27 01:46:08 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:46:08.369472 | orchestrator | 2025-09-27 01:46:08 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:46:11.421109 | orchestrator | 2025-09-27 01:46:11 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:46:11.422136 | orchestrator | 2025-09-27 01:46:11 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:46:11.423269 | orchestrator | 2025-09-27 01:46:11 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:46:14.468095 | orchestrator | 2025-09-27 01:46:14 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:46:14.469969 | orchestrator | 2025-09-27 01:46:14 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:46:14.470067 | orchestrator | 2025-09-27 01:46:14 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:46:17.516490 | orchestrator | 2025-09-27 01:46:17 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:46:17.517154 | orchestrator | 2025-09-27 01:46:17 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:46:17.517238 | orchestrator | 2025-09-27 01:46:17 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:46:20.560379 | orchestrator | 2025-09-27 01:46:20 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:46:20.561593 | orchestrator | 2025-09-27 01:46:20 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:46:20.561619 | orchestrator | 2025-09-27 01:46:20 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:46:23.605284 | orchestrator | 2025-09-27 01:46:23 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:46:23.606854 | orchestrator | 2025-09-27 01:46:23 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:46:23.607080 | orchestrator | 2025-09-27 01:46:23 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:46:26.649918 | orchestrator | 2025-09-27 01:46:26 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:46:26.651349 | orchestrator | 2025-09-27 01:46:26 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:46:26.651376 | orchestrator | 2025-09-27 01:46:26 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:46:29.696016 | orchestrator | 2025-09-27 01:46:29 | INFO  | 
Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:46:29.697526 | orchestrator | 2025-09-27 01:46:29 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:46:29.697601 | orchestrator | 2025-09-27 01:46:29 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:46:32.740521 | orchestrator | 2025-09-27 01:46:32 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:46:32.741945 | orchestrator | 2025-09-27 01:46:32 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:46:32.742438 | orchestrator | 2025-09-27 01:46:32 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:46:35.787408 | orchestrator | 2025-09-27 01:46:35 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:46:35.787997 | orchestrator | 2025-09-27 01:46:35 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:46:35.788029 | orchestrator | 2025-09-27 01:46:35 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:46:38.836196 | orchestrator | 2025-09-27 01:46:38 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:46:38.837882 | orchestrator | 2025-09-27 01:46:38 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:46:38.837927 | orchestrator | 2025-09-27 01:46:38 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:46:41.881479 | orchestrator | 2025-09-27 01:46:41 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:46:41.882480 | orchestrator | 2025-09-27 01:46:41 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:46:41.882657 | orchestrator | 2025-09-27 01:46:41 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:46:44.924676 | orchestrator | 2025-09-27 01:46:44 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:46:44.925942 | orchestrator | 2025-09-27 01:46:44 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:46:44.926077 | orchestrator | 2025-09-27 01:46:44 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:46:47.968775 | orchestrator | 2025-09-27 01:46:47 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:46:47.970152 | orchestrator | 2025-09-27 01:46:47 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:46:47.970188 | orchestrator | 2025-09-27 01:46:47 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:46:51.007875 | orchestrator | 2025-09-27 01:46:51 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:46:51.008919 | orchestrator | 2025-09-27 01:46:51 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:46:51.009024 | orchestrator | 2025-09-27 01:46:51 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:46:54.051449 | orchestrator | 2025-09-27 01:46:54 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:46:54.051667 | orchestrator | 2025-09-27 01:46:54 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:46:54.052143 | orchestrator | 2025-09-27 01:46:54 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:46:57.094347 | orchestrator | 2025-09-27 01:46:57 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:46:57.096302 | 
orchestrator | 2025-09-27 01:46:57 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:46:57.096331 | orchestrator | 2025-09-27 01:46:57 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:47:00.148396 | orchestrator | 2025-09-27 01:47:00 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:47:00.148491 | orchestrator | 2025-09-27 01:47:00 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:47:00.148500 | orchestrator | 2025-09-27 01:47:00 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:47:03.192075 | orchestrator | 2025-09-27 01:47:03 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:47:03.193214 | orchestrator | 2025-09-27 01:47:03 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:47:03.193241 | orchestrator | 2025-09-27 01:47:03 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:47:06.239749 | orchestrator | 2025-09-27 01:47:06 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:47:06.241326 | orchestrator | 2025-09-27 01:47:06 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:47:06.241350 | orchestrator | 2025-09-27 01:47:06 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:47:09.283733 | orchestrator | 2025-09-27 01:47:09 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:47:09.285217 | orchestrator | 2025-09-27 01:47:09 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:47:09.285243 | orchestrator | 2025-09-27 01:47:09 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:47:12.331475 | orchestrator | 2025-09-27 01:47:12 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:47:12.333037 | orchestrator | 2025-09-27 01:47:12 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:47:12.333073 | orchestrator | 2025-09-27 01:47:12 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:47:15.378278 | orchestrator | 2025-09-27 01:47:15 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:47:15.380085 | orchestrator | 2025-09-27 01:47:15 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:47:15.380114 | orchestrator | 2025-09-27 01:47:15 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:47:18.427010 | orchestrator | 2025-09-27 01:47:18 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:47:18.428355 | orchestrator | 2025-09-27 01:47:18 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:47:18.428774 | orchestrator | 2025-09-27 01:47:18 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:47:21.475028 | orchestrator | 2025-09-27 01:47:21 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:47:21.476638 | orchestrator | 2025-09-27 01:47:21 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:47:21.476671 | orchestrator | 2025-09-27 01:47:21 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:47:24.523512 | orchestrator | 2025-09-27 01:47:24 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:47:24.524570 | orchestrator | 2025-09-27 01:47:24 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state 
STARTED 2025-09-27 01:47:24.524600 | orchestrator | 2025-09-27 01:47:24 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:47:27.566910 | orchestrator | 2025-09-27 01:47:27 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:47:27.570250 | orchestrator | 2025-09-27 01:47:27 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:47:27.570280 | orchestrator | 2025-09-27 01:47:27 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:47:30.617425 | orchestrator | 2025-09-27 01:47:30 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:47:30.619450 | orchestrator | 2025-09-27 01:47:30 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:47:30.619564 | orchestrator | 2025-09-27 01:47:30 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:47:33.666579 | orchestrator | 2025-09-27 01:47:33 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:47:33.668223 | orchestrator | 2025-09-27 01:47:33 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:47:33.668246 | orchestrator | 2025-09-27 01:47:33 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:47:36.704418 | orchestrator | 2025-09-27 01:47:36 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:47:36.706608 | orchestrator | 2025-09-27 01:47:36 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:47:36.706654 | orchestrator | 2025-09-27 01:47:36 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:47:39.747759 | orchestrator | 2025-09-27 01:47:39 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:47:39.748543 | orchestrator | 2025-09-27 01:47:39 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:47:39.748575 | orchestrator | 2025-09-27 01:47:39 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:47:42.796084 | orchestrator | 2025-09-27 01:47:42 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:47:42.797888 | orchestrator | 2025-09-27 01:47:42 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:47:42.798192 | orchestrator | 2025-09-27 01:47:42 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:47:45.839477 | orchestrator | 2025-09-27 01:47:45 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:47:45.839964 | orchestrator | 2025-09-27 01:47:45 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:47:45.839990 | orchestrator | 2025-09-27 01:47:45 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:47:48.879703 | orchestrator | 2025-09-27 01:47:48 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:47:48.880107 | orchestrator | 2025-09-27 01:47:48 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:47:48.880139 | orchestrator | 2025-09-27 01:47:48 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:47:51.932239 | orchestrator | 2025-09-27 01:47:51 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:47:51.933929 | orchestrator | 2025-09-27 01:47:51 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:47:51.933959 | orchestrator | 2025-09-27 01:47:51 | INFO  | Wait 1 second(s) 
until the next check 2025-09-27 01:47:54.982834 | orchestrator | 2025-09-27 01:47:54 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:47:54.986260 | orchestrator | 2025-09-27 01:47:54 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:47:54.986290 | orchestrator | 2025-09-27 01:47:54 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:47:58.032882 | orchestrator | 2025-09-27 01:47:58 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:47:58.034308 | orchestrator | 2025-09-27 01:47:58 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:47:58.034345 | orchestrator | 2025-09-27 01:47:58 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:48:01.076869 | orchestrator | 2025-09-27 01:48:01 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:48:01.079134 | orchestrator | 2025-09-27 01:48:01 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:48:01.079389 | orchestrator | 2025-09-27 01:48:01 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:48:04.128872 | orchestrator | 2025-09-27 01:48:04 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:48:04.130121 | orchestrator | 2025-09-27 01:48:04 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:48:04.130143 | orchestrator | 2025-09-27 01:48:04 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:48:07.173069 | orchestrator | 2025-09-27 01:48:07 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:48:07.174450 | orchestrator | 2025-09-27 01:48:07 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:48:07.174712 | orchestrator | 2025-09-27 01:48:07 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:48:10.223110 | orchestrator | 2025-09-27 01:48:10 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:48:10.227001 | orchestrator | 2025-09-27 01:48:10 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:48:10.227030 | orchestrator | 2025-09-27 01:48:10 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:48:13.272689 | orchestrator | 2025-09-27 01:48:13 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:48:13.274689 | orchestrator | 2025-09-27 01:48:13 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:48:13.274754 | orchestrator | 2025-09-27 01:48:13 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:48:16.321346 | orchestrator | 2025-09-27 01:48:16 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:48:16.324067 | orchestrator | 2025-09-27 01:48:16 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:48:16.324098 | orchestrator | 2025-09-27 01:48:16 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:48:19.368268 | orchestrator | 2025-09-27 01:48:19 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:48:19.370446 | orchestrator | 2025-09-27 01:48:19 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:48:19.370475 | orchestrator | 2025-09-27 01:48:19 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:48:22.412322 | orchestrator | 2025-09-27 01:48:22 | INFO  | 
Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:48:22.414338 | orchestrator | 2025-09-27 01:48:22 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:48:22.414365 | orchestrator | 2025-09-27 01:48:22 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:48:25.463454 | orchestrator | 2025-09-27 01:48:25 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:48:25.464855 | orchestrator | 2025-09-27 01:48:25 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:48:25.464935 | orchestrator | 2025-09-27 01:48:25 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:48:28.511653 | orchestrator | 2025-09-27 01:48:28 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:48:28.512628 | orchestrator | 2025-09-27 01:48:28 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:48:28.512817 | orchestrator | 2025-09-27 01:48:28 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:48:31.560077 | orchestrator | 2025-09-27 01:48:31 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:48:31.562171 | orchestrator | 2025-09-27 01:48:31 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:48:31.562242 | orchestrator | 2025-09-27 01:48:31 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:48:34.605444 | orchestrator | 2025-09-27 01:48:34 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:48:34.607673 | orchestrator | 2025-09-27 01:48:34 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:48:34.607703 | orchestrator | 2025-09-27 01:48:34 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:48:37.650294 | orchestrator | 2025-09-27 01:48:37 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:48:37.651468 | orchestrator | 2025-09-27 01:48:37 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:48:37.651600 | orchestrator | 2025-09-27 01:48:37 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:48:40.697524 | orchestrator | 2025-09-27 01:48:40 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:48:40.698978 | orchestrator | 2025-09-27 01:48:40 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:48:40.699003 | orchestrator | 2025-09-27 01:48:40 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:48:43.742688 | orchestrator | 2025-09-27 01:48:43 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:48:43.744028 | orchestrator | 2025-09-27 01:48:43 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:48:43.744054 | orchestrator | 2025-09-27 01:48:43 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:48:46.787588 | orchestrator | 2025-09-27 01:48:46 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:48:46.789229 | orchestrator | 2025-09-27 01:48:46 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:48:46.789252 | orchestrator | 2025-09-27 01:48:46 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:48:49.841070 | orchestrator | 2025-09-27 01:48:49 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:48:49.842926 | 
orchestrator | 2025-09-27 01:48:49 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED
2025-09-27 01:48:49.842968 | orchestrator | 2025-09-27 01:48:49 | INFO  | Wait 1 second(s) until the next check
[... polling loop condensed: from 2025-09-27 01:48:52 to 2025-09-27 01:58:10 the same check repeats roughly every 3 seconds, each cycle reporting that task c8c195a8-0572-4728-82e9-0d11795e0ba9 and task 6080a85d-265e-44df-8fd4-200b92feb3b5 are still in state STARTED, followed by "Wait 1 second(s) until the next check" ...]
2025-09-27 01:58:14.019231 | orchestrator | 2025-09-27 01:58:14 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED
2025-09-27 01:58:14.020716 |
orchestrator | 2025-09-27 01:58:14 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:58:14.020759 | orchestrator | 2025-09-27 01:58:14 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:58:17.063519 | orchestrator | 2025-09-27 01:58:17 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:58:17.064466 | orchestrator | 2025-09-27 01:58:17 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:58:17.064499 | orchestrator | 2025-09-27 01:58:17 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:58:20.109693 | orchestrator | 2025-09-27 01:58:20 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:58:20.111757 | orchestrator | 2025-09-27 01:58:20 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:58:20.111864 | orchestrator | 2025-09-27 01:58:20 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:58:23.157520 | orchestrator | 2025-09-27 01:58:23 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:58:23.158673 | orchestrator | 2025-09-27 01:58:23 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:58:23.158699 | orchestrator | 2025-09-27 01:58:23 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:58:26.209034 | orchestrator | 2025-09-27 01:58:26 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:58:26.212641 | orchestrator | 2025-09-27 01:58:26 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:58:26.212729 | orchestrator | 2025-09-27 01:58:26 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:58:29.260080 | orchestrator | 2025-09-27 01:58:29 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:58:29.261406 | orchestrator | 2025-09-27 01:58:29 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:58:29.261536 | orchestrator | 2025-09-27 01:58:29 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:58:32.318939 | orchestrator | 2025-09-27 01:58:32 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:58:32.322120 | orchestrator | 2025-09-27 01:58:32 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:58:32.322156 | orchestrator | 2025-09-27 01:58:32 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:58:35.373163 | orchestrator | 2025-09-27 01:58:35 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:58:35.375040 | orchestrator | 2025-09-27 01:58:35 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:58:35.375075 | orchestrator | 2025-09-27 01:58:35 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:58:38.426392 | orchestrator | 2025-09-27 01:58:38 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:58:38.428299 | orchestrator | 2025-09-27 01:58:38 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:58:38.428752 | orchestrator | 2025-09-27 01:58:38 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:58:41.477215 | orchestrator | 2025-09-27 01:58:41 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:58:41.481861 | orchestrator | 2025-09-27 01:58:41 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state 
STARTED 2025-09-27 01:58:41.481897 | orchestrator | 2025-09-27 01:58:41 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:58:44.534951 | orchestrator | 2025-09-27 01:58:44 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:58:44.538385 | orchestrator | 2025-09-27 01:58:44 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:58:44.538457 | orchestrator | 2025-09-27 01:58:44 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:58:47.585047 | orchestrator | 2025-09-27 01:58:47 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:58:47.586081 | orchestrator | 2025-09-27 01:58:47 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:58:47.586199 | orchestrator | 2025-09-27 01:58:47 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:58:50.632379 | orchestrator | 2025-09-27 01:58:50 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:58:50.634301 | orchestrator | 2025-09-27 01:58:50 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:58:50.634328 | orchestrator | 2025-09-27 01:58:50 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:58:53.683906 | orchestrator | 2025-09-27 01:58:53 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:58:53.685350 | orchestrator | 2025-09-27 01:58:53 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:58:53.685374 | orchestrator | 2025-09-27 01:58:53 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:58:56.727296 | orchestrator | 2025-09-27 01:58:56 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:58:56.728360 | orchestrator | 2025-09-27 01:58:56 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:58:56.728386 | orchestrator | 2025-09-27 01:58:56 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:58:59.771161 | orchestrator | 2025-09-27 01:58:59 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:58:59.772509 | orchestrator | 2025-09-27 01:58:59 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:58:59.772535 | orchestrator | 2025-09-27 01:58:59 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:59:02.821484 | orchestrator | 2025-09-27 01:59:02 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:59:02.822635 | orchestrator | 2025-09-27 01:59:02 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:59:02.822671 | orchestrator | 2025-09-27 01:59:02 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:59:05.871047 | orchestrator | 2025-09-27 01:59:05 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:59:05.872399 | orchestrator | 2025-09-27 01:59:05 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:59:05.872654 | orchestrator | 2025-09-27 01:59:05 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:59:08.917608 | orchestrator | 2025-09-27 01:59:08 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:59:08.918899 | orchestrator | 2025-09-27 01:59:08 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:59:08.918922 | orchestrator | 2025-09-27 01:59:08 | INFO  | Wait 1 second(s) 
until the next check 2025-09-27 01:59:11.966183 | orchestrator | 2025-09-27 01:59:11 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:59:11.967432 | orchestrator | 2025-09-27 01:59:11 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:59:11.967475 | orchestrator | 2025-09-27 01:59:11 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:59:15.012743 | orchestrator | 2025-09-27 01:59:15 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:59:15.014622 | orchestrator | 2025-09-27 01:59:15 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:59:15.014667 | orchestrator | 2025-09-27 01:59:15 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:59:18.059040 | orchestrator | 2025-09-27 01:59:18 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:59:18.060136 | orchestrator | 2025-09-27 01:59:18 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:59:18.060423 | orchestrator | 2025-09-27 01:59:18 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:59:21.108421 | orchestrator | 2025-09-27 01:59:21 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:59:21.109945 | orchestrator | 2025-09-27 01:59:21 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:59:21.109972 | orchestrator | 2025-09-27 01:59:21 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:59:24.161526 | orchestrator | 2025-09-27 01:59:24 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:59:24.162802 | orchestrator | 2025-09-27 01:59:24 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:59:24.162850 | orchestrator | 2025-09-27 01:59:24 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:59:27.204921 | orchestrator | 2025-09-27 01:59:27 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:59:27.206464 | orchestrator | 2025-09-27 01:59:27 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:59:27.206493 | orchestrator | 2025-09-27 01:59:27 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:59:30.250708 | orchestrator | 2025-09-27 01:59:30 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:59:30.253508 | orchestrator | 2025-09-27 01:59:30 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:59:30.253539 | orchestrator | 2025-09-27 01:59:30 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:59:33.301305 | orchestrator | 2025-09-27 01:59:33 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:59:33.302713 | orchestrator | 2025-09-27 01:59:33 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:59:33.302806 | orchestrator | 2025-09-27 01:59:33 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:59:36.345170 | orchestrator | 2025-09-27 01:59:36 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:59:36.347009 | orchestrator | 2025-09-27 01:59:36 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:59:36.347106 | orchestrator | 2025-09-27 01:59:36 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:59:39.396617 | orchestrator | 2025-09-27 01:59:39 | INFO  | 
Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:59:39.397998 | orchestrator | 2025-09-27 01:59:39 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:59:39.398071 | orchestrator | 2025-09-27 01:59:39 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:59:42.447044 | orchestrator | 2025-09-27 01:59:42 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:59:42.448599 | orchestrator | 2025-09-27 01:59:42 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:59:42.448674 | orchestrator | 2025-09-27 01:59:42 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:59:45.492903 | orchestrator | 2025-09-27 01:59:45 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:59:45.495074 | orchestrator | 2025-09-27 01:59:45 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:59:45.495104 | orchestrator | 2025-09-27 01:59:45 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:59:48.542373 | orchestrator | 2025-09-27 01:59:48 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:59:48.544051 | orchestrator | 2025-09-27 01:59:48 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:59:48.544105 | orchestrator | 2025-09-27 01:59:48 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:59:51.587896 | orchestrator | 2025-09-27 01:59:51 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:59:51.589474 | orchestrator | 2025-09-27 01:59:51 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:59:51.589514 | orchestrator | 2025-09-27 01:59:51 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:59:54.637807 | orchestrator | 2025-09-27 01:59:54 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:59:54.639475 | orchestrator | 2025-09-27 01:59:54 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:59:54.639502 | orchestrator | 2025-09-27 01:59:54 | INFO  | Wait 1 second(s) until the next check 2025-09-27 01:59:57.685365 | orchestrator | 2025-09-27 01:59:57 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 01:59:57.686331 | orchestrator | 2025-09-27 01:59:57 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 01:59:57.686405 | orchestrator | 2025-09-27 01:59:57 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:00:00.721816 | orchestrator | 2025-09-27 02:00:00 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:00:00.724074 | orchestrator | 2025-09-27 02:00:00 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:00:00.724109 | orchestrator | 2025-09-27 02:00:00 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:00:03.761290 | orchestrator | 2025-09-27 02:00:03 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:00:03.762605 | orchestrator | 2025-09-27 02:00:03 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:00:03.762835 | orchestrator | 2025-09-27 02:00:03 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:00:06.808802 | orchestrator | 2025-09-27 02:00:06 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:00:06.810915 | 
orchestrator | 2025-09-27 02:00:06 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:00:06.810950 | orchestrator | 2025-09-27 02:00:06 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:00:09.860219 | orchestrator | 2025-09-27 02:00:09 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:00:09.862057 | orchestrator | 2025-09-27 02:00:09 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:00:09.862136 | orchestrator | 2025-09-27 02:00:09 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:00:12.903946 | orchestrator | 2025-09-27 02:00:12 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:00:12.905475 | orchestrator | 2025-09-27 02:00:12 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:00:12.905855 | orchestrator | 2025-09-27 02:00:12 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:00:15.952348 | orchestrator | 2025-09-27 02:00:15 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:00:15.953986 | orchestrator | 2025-09-27 02:00:15 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:00:15.954109 | orchestrator | 2025-09-27 02:00:15 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:00:19.007619 | orchestrator | 2025-09-27 02:00:19 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:00:19.010267 | orchestrator | 2025-09-27 02:00:19 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:00:19.010323 | orchestrator | 2025-09-27 02:00:19 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:00:22.058187 | orchestrator | 2025-09-27 02:00:22 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:00:22.060413 | orchestrator | 2025-09-27 02:00:22 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:00:22.061141 | orchestrator | 2025-09-27 02:00:22 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:00:25.098121 | orchestrator | 2025-09-27 02:00:25 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:00:25.099445 | orchestrator | 2025-09-27 02:00:25 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:00:25.099473 | orchestrator | 2025-09-27 02:00:25 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:00:28.146314 | orchestrator | 2025-09-27 02:00:28 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:00:28.148659 | orchestrator | 2025-09-27 02:00:28 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:00:28.148689 | orchestrator | 2025-09-27 02:00:28 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:00:31.195217 | orchestrator | 2025-09-27 02:00:31 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:00:31.196979 | orchestrator | 2025-09-27 02:00:31 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:00:31.197009 | orchestrator | 2025-09-27 02:00:31 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:00:34.247162 | orchestrator | 2025-09-27 02:00:34 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:00:34.248754 | orchestrator | 2025-09-27 02:00:34 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state 
STARTED 2025-09-27 02:00:34.248833 | orchestrator | 2025-09-27 02:00:34 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:00:37.295727 | orchestrator | 2025-09-27 02:00:37 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:00:37.298575 | orchestrator | 2025-09-27 02:00:37 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:00:37.298813 | orchestrator | 2025-09-27 02:00:37 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:00:40.342151 | orchestrator | 2025-09-27 02:00:40 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:00:40.344220 | orchestrator | 2025-09-27 02:00:40 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:00:40.344405 | orchestrator | 2025-09-27 02:00:40 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:00:43.384103 | orchestrator | 2025-09-27 02:00:43 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:00:43.385774 | orchestrator | 2025-09-27 02:00:43 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:00:43.386150 | orchestrator | 2025-09-27 02:00:43 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:00:46.434926 | orchestrator | 2025-09-27 02:00:46 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:00:46.437855 | orchestrator | 2025-09-27 02:00:46 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:00:46.437912 | orchestrator | 2025-09-27 02:00:46 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:00:49.491493 | orchestrator | 2025-09-27 02:00:49 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:00:49.492390 | orchestrator | 2025-09-27 02:00:49 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:00:49.492556 | orchestrator | 2025-09-27 02:00:49 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:00:52.543055 | orchestrator | 2025-09-27 02:00:52 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:00:52.544636 | orchestrator | 2025-09-27 02:00:52 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:00:52.544780 | orchestrator | 2025-09-27 02:00:52 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:00:55.593809 | orchestrator | 2025-09-27 02:00:55 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:00:55.595991 | orchestrator | 2025-09-27 02:00:55 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:00:55.596112 | orchestrator | 2025-09-27 02:00:55 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:00:58.649572 | orchestrator | 2025-09-27 02:00:58 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:00:58.652266 | orchestrator | 2025-09-27 02:00:58 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:00:58.652298 | orchestrator | 2025-09-27 02:00:58 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:01:01.700884 | orchestrator | 2025-09-27 02:01:01 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:01:01.702993 | orchestrator | 2025-09-27 02:01:01 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:01:01.703605 | orchestrator | 2025-09-27 02:01:01 | INFO  | Wait 1 second(s) 
until the next check 2025-09-27 02:01:04.751136 | orchestrator | 2025-09-27 02:01:04 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:01:04.752650 | orchestrator | 2025-09-27 02:01:04 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:01:04.752687 | orchestrator | 2025-09-27 02:01:04 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:01:07.794547 | orchestrator | 2025-09-27 02:01:07 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:01:07.796517 | orchestrator | 2025-09-27 02:01:07 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:01:07.796552 | orchestrator | 2025-09-27 02:01:07 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:01:10.848155 | orchestrator | 2025-09-27 02:01:10 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:01:10.849236 | orchestrator | 2025-09-27 02:01:10 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:01:10.849422 | orchestrator | 2025-09-27 02:01:10 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:01:13.894589 | orchestrator | 2025-09-27 02:01:13 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:01:13.894689 | orchestrator | 2025-09-27 02:01:13 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:01:13.894705 | orchestrator | 2025-09-27 02:01:13 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:01:16.942358 | orchestrator | 2025-09-27 02:01:16 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:01:16.943900 | orchestrator | 2025-09-27 02:01:16 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:01:16.943970 | orchestrator | 2025-09-27 02:01:16 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:01:19.988865 | orchestrator | 2025-09-27 02:01:19 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:01:19.990284 | orchestrator | 2025-09-27 02:01:19 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:01:19.990324 | orchestrator | 2025-09-27 02:01:19 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:01:23.038846 | orchestrator | 2025-09-27 02:01:23 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:01:23.040244 | orchestrator | 2025-09-27 02:01:23 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:01:23.040323 | orchestrator | 2025-09-27 02:01:23 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:01:26.082260 | orchestrator | 2025-09-27 02:01:26 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:01:26.084005 | orchestrator | 2025-09-27 02:01:26 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:01:26.084151 | orchestrator | 2025-09-27 02:01:26 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:01:29.129440 | orchestrator | 2025-09-27 02:01:29 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:01:29.130868 | orchestrator | 2025-09-27 02:01:29 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:01:29.131033 | orchestrator | 2025-09-27 02:01:29 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:01:32.183196 | orchestrator | 2025-09-27 02:01:32 | INFO  | 
Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:01:32.185120 | orchestrator | 2025-09-27 02:01:32 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:01:32.185148 | orchestrator | 2025-09-27 02:01:32 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:01:35.230748 | orchestrator | 2025-09-27 02:01:35 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:01:35.232109 | orchestrator | 2025-09-27 02:01:35 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:01:35.232124 | orchestrator | 2025-09-27 02:01:35 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:01:38.278312 | orchestrator | 2025-09-27 02:01:38 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:01:38.280303 | orchestrator | 2025-09-27 02:01:38 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:01:38.280382 | orchestrator | 2025-09-27 02:01:38 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:01:41.320181 | orchestrator | 2025-09-27 02:01:41 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:01:41.323191 | orchestrator | 2025-09-27 02:01:41 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:01:41.323223 | orchestrator | 2025-09-27 02:01:41 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:01:44.367842 | orchestrator | 2025-09-27 02:01:44 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:01:44.369522 | orchestrator | 2025-09-27 02:01:44 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:01:44.369553 | orchestrator | 2025-09-27 02:01:44 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:01:47.419463 | orchestrator | 2025-09-27 02:01:47 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:01:47.421462 | orchestrator | 2025-09-27 02:01:47 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:01:47.421523 | orchestrator | 2025-09-27 02:01:47 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:01:50.468265 | orchestrator | 2025-09-27 02:01:50 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:01:50.471403 | orchestrator | 2025-09-27 02:01:50 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:01:50.471430 | orchestrator | 2025-09-27 02:01:50 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:01:53.520230 | orchestrator | 2025-09-27 02:01:53 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:01:53.521154 | orchestrator | 2025-09-27 02:01:53 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:01:53.521174 | orchestrator | 2025-09-27 02:01:53 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:01:56.568115 | orchestrator | 2025-09-27 02:01:56 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:01:56.569581 | orchestrator | 2025-09-27 02:01:56 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:01:56.569624 | orchestrator | 2025-09-27 02:01:56 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:01:59.615140 | orchestrator | 2025-09-27 02:01:59 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:01:59.617444 | 
orchestrator | 2025-09-27 02:01:59 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:01:59.617478 | orchestrator | 2025-09-27 02:01:59 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:02:02.666065 | orchestrator | 2025-09-27 02:02:02 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:02:02.668188 | orchestrator | 2025-09-27 02:02:02 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:02:02.668221 | orchestrator | 2025-09-27 02:02:02 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:02:05.715340 | orchestrator | 2025-09-27 02:02:05 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:02:05.717563 | orchestrator | 2025-09-27 02:02:05 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:02:05.717594 | orchestrator | 2025-09-27 02:02:05 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:02:08.763174 | orchestrator | 2025-09-27 02:02:08 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:02:08.763885 | orchestrator | 2025-09-27 02:02:08 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:02:08.763919 | orchestrator | 2025-09-27 02:02:08 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:02:11.804885 | orchestrator | 2025-09-27 02:02:11 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:02:11.806190 | orchestrator | 2025-09-27 02:02:11 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:02:11.806580 | orchestrator | 2025-09-27 02:02:11 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:02:14.855498 | orchestrator | 2025-09-27 02:02:14 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:02:14.857253 | orchestrator | 2025-09-27 02:02:14 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:02:14.857336 | orchestrator | 2025-09-27 02:02:14 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:02:17.906876 | orchestrator | 2025-09-27 02:02:17 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:02:17.908312 | orchestrator | 2025-09-27 02:02:17 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:02:17.908347 | orchestrator | 2025-09-27 02:02:17 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:02:20.960545 | orchestrator | 2025-09-27 02:02:20 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:02:20.961919 | orchestrator | 2025-09-27 02:02:20 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:02:20.961992 | orchestrator | 2025-09-27 02:02:20 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:02:24.017461 | orchestrator | 2025-09-27 02:02:24 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:02:24.017552 | orchestrator | 2025-09-27 02:02:24 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:02:24.017566 | orchestrator | 2025-09-27 02:02:24 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:02:27.061278 | orchestrator | 2025-09-27 02:02:27 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:02:27.063298 | orchestrator | 2025-09-27 02:02:27 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state 
STARTED 2025-09-27 02:02:27.063334 | orchestrator | 2025-09-27 02:02:27 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:02:30.104776 | orchestrator | 2025-09-27 02:02:30 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:02:30.106234 | orchestrator | 2025-09-27 02:02:30 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:02:30.106310 | orchestrator | 2025-09-27 02:02:30 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:02:33.169347 | orchestrator | 2025-09-27 02:02:33 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:02:33.169473 | orchestrator | 2025-09-27 02:02:33 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:02:33.169499 | orchestrator | 2025-09-27 02:02:33 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:02:36.210323 | orchestrator | 2025-09-27 02:02:36 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:02:36.212558 | orchestrator | 2025-09-27 02:02:36 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:02:36.212589 | orchestrator | 2025-09-27 02:02:36 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:02:39.276221 | orchestrator | 2025-09-27 02:02:39 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:02:39.277848 | orchestrator | 2025-09-27 02:02:39 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:02:39.277878 | orchestrator | 2025-09-27 02:02:39 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:02:42.325271 | orchestrator | 2025-09-27 02:02:42 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:02:42.327290 | orchestrator | 2025-09-27 02:02:42 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:02:42.327573 | orchestrator | 2025-09-27 02:02:42 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:02:45.378484 | orchestrator | 2025-09-27 02:02:45 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:02:45.380775 | orchestrator | 2025-09-27 02:02:45 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:02:45.380810 | orchestrator | 2025-09-27 02:02:45 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:02:48.426856 | orchestrator | 2025-09-27 02:02:48 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:02:48.428123 | orchestrator | 2025-09-27 02:02:48 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:02:48.428155 | orchestrator | 2025-09-27 02:02:48 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:02:51.491265 | orchestrator | 2025-09-27 02:02:51 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:02:51.492658 | orchestrator | 2025-09-27 02:02:51 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:02:51.492699 | orchestrator | 2025-09-27 02:02:51 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:02:54.549872 | orchestrator | 2025-09-27 02:02:54 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:02:54.551720 | orchestrator | 2025-09-27 02:02:54 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:02:54.551764 | orchestrator | 2025-09-27 02:02:54 | INFO  | Wait 1 second(s) 
until the next check 2025-09-27 02:02:57.588140 | orchestrator | 2025-09-27 02:02:57 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:02:57.589349 | orchestrator | 2025-09-27 02:02:57 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:02:57.589414 | orchestrator | 2025-09-27 02:02:57 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:03:00.633867 | orchestrator | 2025-09-27 02:03:00 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:03:00.635487 | orchestrator | 2025-09-27 02:03:00 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:03:00.635877 | orchestrator | 2025-09-27 02:03:00 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:03:03.682197 | orchestrator | 2025-09-27 02:03:03 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:03:03.684829 | orchestrator | 2025-09-27 02:03:03 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:03:03.685746 | orchestrator | 2025-09-27 02:03:03 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:03:06.736200 | orchestrator | 2025-09-27 02:03:06 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:03:06.737522 | orchestrator | 2025-09-27 02:03:06 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:03:06.737906 | orchestrator | 2025-09-27 02:03:06 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:03:09.781126 | orchestrator | 2025-09-27 02:03:09 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:03:09.783745 | orchestrator | 2025-09-27 02:03:09 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:03:09.783814 | orchestrator | 2025-09-27 02:03:09 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:03:12.826909 | orchestrator | 2025-09-27 02:03:12 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:03:12.829240 | orchestrator | 2025-09-27 02:03:12 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:03:12.829598 | orchestrator | 2025-09-27 02:03:12 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:03:15.876949 | orchestrator | 2025-09-27 02:03:15 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:03:15.878183 | orchestrator | 2025-09-27 02:03:15 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:03:15.878226 | orchestrator | 2025-09-27 02:03:15 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:03:18.924445 | orchestrator | 2025-09-27 02:03:18 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:03:18.925761 | orchestrator | 2025-09-27 02:03:18 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:03:18.925833 | orchestrator | 2025-09-27 02:03:18 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:03:21.972314 | orchestrator | 2025-09-27 02:03:21 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:03:21.973832 | orchestrator | 2025-09-27 02:03:21 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:03:21.974871 | orchestrator | 2025-09-27 02:03:21 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:03:25.030466 | orchestrator | 2025-09-27 02:03:25 | INFO  | 
Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:03:25.031282 | orchestrator | 2025-09-27 02:03:25 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:03:25.031336 | orchestrator | 2025-09-27 02:03:25 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:03:28.072314 | orchestrator | 2025-09-27 02:03:28 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:03:28.074445 | orchestrator | 2025-09-27 02:03:28 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:03:28.074475 | orchestrator | 2025-09-27 02:03:28 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:03:31.122868 | orchestrator | 2025-09-27 02:03:31 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:03:31.125003 | orchestrator | 2025-09-27 02:03:31 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:03:31.125076 | orchestrator | 2025-09-27 02:03:31 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:03:34.168265 | orchestrator | 2025-09-27 02:03:34 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:03:34.170079 | orchestrator | 2025-09-27 02:03:34 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:03:34.170116 | orchestrator | 2025-09-27 02:03:34 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:03:37.213105 | orchestrator | 2025-09-27 02:03:37 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:03:37.214947 | orchestrator | 2025-09-27 02:03:37 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:03:37.215055 | orchestrator | 2025-09-27 02:03:37 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:03:40.265359 | orchestrator | 2025-09-27 02:03:40 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:03:40.267615 | orchestrator | 2025-09-27 02:03:40 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:03:40.267652 | orchestrator | 2025-09-27 02:03:40 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:03:43.323079 | orchestrator | 2025-09-27 02:03:43 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:03:43.325257 | orchestrator | 2025-09-27 02:03:43 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:03:43.325289 | orchestrator | 2025-09-27 02:03:43 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:03:46.380856 | orchestrator | 2025-09-27 02:03:46 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:03:46.383036 | orchestrator | 2025-09-27 02:03:46 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:03:46.383329 | orchestrator | 2025-09-27 02:03:46 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:03:49.434803 | orchestrator | 2025-09-27 02:03:49 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:03:49.436213 | orchestrator | 2025-09-27 02:03:49 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:03:49.436284 | orchestrator | 2025-09-27 02:03:49 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:03:52.483632 | orchestrator | 2025-09-27 02:03:52 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:03:52.484108 | 
orchestrator | 2025-09-27 02:03:52 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:03:52.484141 | orchestrator | 2025-09-27 02:03:52 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:03:55.528382 | orchestrator | 2025-09-27 02:03:55 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:03:55.529809 | orchestrator | 2025-09-27 02:03:55 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:03:55.529840 | orchestrator | 2025-09-27 02:03:55 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:03:58.566086 | orchestrator | 2025-09-27 02:03:58 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:03:58.567572 | orchestrator | 2025-09-27 02:03:58 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:03:58.567600 | orchestrator | 2025-09-27 02:03:58 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:04:01.613395 | orchestrator | 2025-09-27 02:04:01 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:04:01.614885 | orchestrator | 2025-09-27 02:04:01 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:04:01.615100 | orchestrator | 2025-09-27 02:04:01 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:04:04.663434 | orchestrator | 2025-09-27 02:04:04 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:04:04.664850 | orchestrator | 2025-09-27 02:04:04 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:04:04.664877 | orchestrator | 2025-09-27 02:04:04 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:04:07.713316 | orchestrator | 2025-09-27 02:04:07 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:04:07.714650 | orchestrator | 2025-09-27 02:04:07 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:04:07.714714 | orchestrator | 2025-09-27 02:04:07 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:04:10.768417 | orchestrator | 2025-09-27 02:04:10 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:04:10.770652 | orchestrator | 2025-09-27 02:04:10 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:04:10.770684 | orchestrator | 2025-09-27 02:04:10 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:04:13.815921 | orchestrator | 2025-09-27 02:04:13 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:04:13.816925 | orchestrator | 2025-09-27 02:04:13 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:04:13.816958 | orchestrator | 2025-09-27 02:04:13 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:04:16.865478 | orchestrator | 2025-09-27 02:04:16 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:04:16.868004 | orchestrator | 2025-09-27 02:04:16 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:04:16.868098 | orchestrator | 2025-09-27 02:04:16 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:04:19.915469 | orchestrator | 2025-09-27 02:04:19 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:04:19.917236 | orchestrator | 2025-09-27 02:04:19 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state 
STARTED 2025-09-27 02:04:19.917519 | orchestrator | 2025-09-27 02:04:19 | INFO  | Wait 1 second(s) until the next check
2025-09-27 02:04:22.960363 | orchestrator | 2025-09-27 02:04:22 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED
2025-09-27 02:04:22.961835 | orchestrator | 2025-09-27 02:04:22 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED
2025-09-27 02:04:22.961865 | orchestrator | 2025-09-27 02:04:22 | INFO  | Wait 1 second(s) until the next check
[... identical status checks repeated roughly every 3 seconds; both tasks remained in state STARTED from 02:04:26 through 02:13:13 ...]
2025-09-27 02:13:16.564631 | orchestrator | 2025-09-27 02:13:16 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED
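The output above is the deployment watcher polling its two task IDs until they leave the STARTED state; although each iteration logs "Wait 1 second(s)", the timestamps advance by roughly three seconds per round because the two status queries themselves take time. A minimal sketch of that polling pattern, assuming a hypothetical get_task_state() helper rather than the real osism task backend:

```python
import time

# Minimal sketch of the polling loop visible in the log above.
# get_task_state() is a hypothetical stand-in; the real state lookup is
# whatever the osism tooling queries for its task IDs.
def get_task_state(task_id: str) -> str:
    raise NotImplementedError("replace with a real status lookup")

def wait_for_tasks(task_ids, interval=1.0, timeout=3600.0):
    """Poll the given task IDs until none of them is still PENDING/STARTED."""
    deadline = time.monotonic() + timeout
    pending = set(task_ids)
    while pending:
        for task_id in sorted(pending):
            state = get_task_state(task_id)
            print(f"Task {task_id} is in state {state}")
            if state not in ("PENDING", "STARTED"):
                pending.discard(task_id)
        if not pending:
            break
        if time.monotonic() > deadline:
            raise TimeoutError(f"tasks still running: {sorted(pending)}")
        print(f"Wait {interval:.0f} second(s) until the next check")
        time.sleep(interval)
```

Usage would be something like wait_for_tasks(["c8c195a8-...", "6080a85d-..."]), which keeps printing the two status lines until both tasks report a terminal state or the timeout is reached.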
orchestrator | 2025-09-27 02:13:16 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:13:16.566270 | orchestrator | 2025-09-27 02:13:16 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:13:19.615851 | orchestrator | 2025-09-27 02:13:19 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:13:19.619467 | orchestrator | 2025-09-27 02:13:19 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:13:19.619543 | orchestrator | 2025-09-27 02:13:19 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:13:22.690978 | orchestrator | 2025-09-27 02:13:22 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:13:22.691042 | orchestrator | 2025-09-27 02:13:22 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:13:22.691056 | orchestrator | 2025-09-27 02:13:22 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:13:25.740638 | orchestrator | 2025-09-27 02:13:25 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:13:25.740712 | orchestrator | 2025-09-27 02:13:25 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:13:25.740725 | orchestrator | 2025-09-27 02:13:25 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:13:28.798125 | orchestrator | 2025-09-27 02:13:28 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:13:28.801242 | orchestrator | 2025-09-27 02:13:28 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:13:28.801292 | orchestrator | 2025-09-27 02:13:28 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:13:31.852450 | orchestrator | 2025-09-27 02:13:31 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:13:31.852548 | orchestrator | 2025-09-27 02:13:31 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:13:31.852562 | orchestrator | 2025-09-27 02:13:31 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:13:34.902458 | orchestrator | 2025-09-27 02:13:34 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:13:34.904049 | orchestrator | 2025-09-27 02:13:34 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:13:34.904083 | orchestrator | 2025-09-27 02:13:34 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:13:37.946585 | orchestrator | 2025-09-27 02:13:37 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:13:37.948153 | orchestrator | 2025-09-27 02:13:37 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:13:37.948741 | orchestrator | 2025-09-27 02:13:37 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:13:41.005021 | orchestrator | 2025-09-27 02:13:41 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:13:41.006146 | orchestrator | 2025-09-27 02:13:41 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:13:41.006297 | orchestrator | 2025-09-27 02:13:41 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:13:44.056318 | orchestrator | 2025-09-27 02:13:44 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:13:44.057749 | orchestrator | 2025-09-27 02:13:44 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state 
STARTED 2025-09-27 02:13:44.057775 | orchestrator | 2025-09-27 02:13:44 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:13:47.123891 | orchestrator | 2025-09-27 02:13:47 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:13:47.125698 | orchestrator | 2025-09-27 02:13:47 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:13:47.125816 | orchestrator | 2025-09-27 02:13:47 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:13:50.174330 | orchestrator | 2025-09-27 02:13:50 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:13:50.175977 | orchestrator | 2025-09-27 02:13:50 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:13:50.176287 | orchestrator | 2025-09-27 02:13:50 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:13:53.268739 | orchestrator | 2025-09-27 02:13:53 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:13:53.269744 | orchestrator | 2025-09-27 02:13:53 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:13:53.269781 | orchestrator | 2025-09-27 02:13:53 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:13:56.319979 | orchestrator | 2025-09-27 02:13:56 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:13:56.321861 | orchestrator | 2025-09-27 02:13:56 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:13:56.321895 | orchestrator | 2025-09-27 02:13:56 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:13:59.379212 | orchestrator | 2025-09-27 02:13:59 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:13:59.380945 | orchestrator | 2025-09-27 02:13:59 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:13:59.380977 | orchestrator | 2025-09-27 02:13:59 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:14:02.424525 | orchestrator | 2025-09-27 02:14:02 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:14:02.426779 | orchestrator | 2025-09-27 02:14:02 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:14:02.426816 | orchestrator | 2025-09-27 02:14:02 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:14:05.478251 | orchestrator | 2025-09-27 02:14:05 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:14:05.479358 | orchestrator | 2025-09-27 02:14:05 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:14:05.479421 | orchestrator | 2025-09-27 02:14:05 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:14:08.530916 | orchestrator | 2025-09-27 02:14:08 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:14:08.531797 | orchestrator | 2025-09-27 02:14:08 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:14:08.531828 | orchestrator | 2025-09-27 02:14:08 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:14:11.585959 | orchestrator | 2025-09-27 02:14:11 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:14:11.588627 | orchestrator | 2025-09-27 02:14:11 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:14:11.588664 | orchestrator | 2025-09-27 02:14:11 | INFO  | Wait 1 second(s) 
until the next check 2025-09-27 02:14:14.634164 | orchestrator | 2025-09-27 02:14:14 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:14:14.635704 | orchestrator | 2025-09-27 02:14:14 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:14:14.635743 | orchestrator | 2025-09-27 02:14:14 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:14:17.686064 | orchestrator | 2025-09-27 02:14:17 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:14:17.687376 | orchestrator | 2025-09-27 02:14:17 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:14:17.687415 | orchestrator | 2025-09-27 02:14:17 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:14:20.735921 | orchestrator | 2025-09-27 02:14:20 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:14:20.737110 | orchestrator | 2025-09-27 02:14:20 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:14:20.737143 | orchestrator | 2025-09-27 02:14:20 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:14:23.781593 | orchestrator | 2025-09-27 02:14:23 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:14:23.783110 | orchestrator | 2025-09-27 02:14:23 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:14:23.783397 | orchestrator | 2025-09-27 02:14:23 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:14:26.835105 | orchestrator | 2025-09-27 02:14:26 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:14:26.836313 | orchestrator | 2025-09-27 02:14:26 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:14:26.836353 | orchestrator | 2025-09-27 02:14:26 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:14:29.881262 | orchestrator | 2025-09-27 02:14:29 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:14:29.883361 | orchestrator | 2025-09-27 02:14:29 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:14:29.883519 | orchestrator | 2025-09-27 02:14:29 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:14:32.929110 | orchestrator | 2025-09-27 02:14:32 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:14:32.931343 | orchestrator | 2025-09-27 02:14:32 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:14:32.931374 | orchestrator | 2025-09-27 02:14:32 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:14:35.981678 | orchestrator | 2025-09-27 02:14:35 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:14:35.983511 | orchestrator | 2025-09-27 02:14:35 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:14:35.983581 | orchestrator | 2025-09-27 02:14:35 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:14:39.034101 | orchestrator | 2025-09-27 02:14:39 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:14:39.035830 | orchestrator | 2025-09-27 02:14:39 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:14:39.035862 | orchestrator | 2025-09-27 02:14:39 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:14:42.084276 | orchestrator | 2025-09-27 02:14:42 | INFO  | 
Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:14:42.085465 | orchestrator | 2025-09-27 02:14:42 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:14:42.085494 | orchestrator | 2025-09-27 02:14:42 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:14:45.126452 | orchestrator | 2025-09-27 02:14:45 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:14:45.128079 | orchestrator | 2025-09-27 02:14:45 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:14:45.128316 | orchestrator | 2025-09-27 02:14:45 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:14:48.173761 | orchestrator | 2025-09-27 02:14:48 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:14:48.175472 | orchestrator | 2025-09-27 02:14:48 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:14:48.175535 | orchestrator | 2025-09-27 02:14:48 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:14:51.223634 | orchestrator | 2025-09-27 02:14:51 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:14:51.225693 | orchestrator | 2025-09-27 02:14:51 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:14:51.225783 | orchestrator | 2025-09-27 02:14:51 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:14:54.271622 | orchestrator | 2025-09-27 02:14:54 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:14:54.273955 | orchestrator | 2025-09-27 02:14:54 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:14:54.274383 | orchestrator | 2025-09-27 02:14:54 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:14:57.316732 | orchestrator | 2025-09-27 02:14:57 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:14:57.318691 | orchestrator | 2025-09-27 02:14:57 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:14:57.318769 | orchestrator | 2025-09-27 02:14:57 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:15:00.370403 | orchestrator | 2025-09-27 02:15:00 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:15:00.371929 | orchestrator | 2025-09-27 02:15:00 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:15:00.371959 | orchestrator | 2025-09-27 02:15:00 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:15:03.421591 | orchestrator | 2025-09-27 02:15:03 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:15:03.424423 | orchestrator | 2025-09-27 02:15:03 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:15:03.424815 | orchestrator | 2025-09-27 02:15:03 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:15:06.475455 | orchestrator | 2025-09-27 02:15:06 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:15:06.477260 | orchestrator | 2025-09-27 02:15:06 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:15:06.477384 | orchestrator | 2025-09-27 02:15:06 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:15:09.526510 | orchestrator | 2025-09-27 02:15:09 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:15:09.527781 | 
orchestrator | 2025-09-27 02:15:09 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:15:09.527899 | orchestrator | 2025-09-27 02:15:09 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:15:12.570807 | orchestrator | 2025-09-27 02:15:12 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:15:12.573233 | orchestrator | 2025-09-27 02:15:12 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:15:12.573308 | orchestrator | 2025-09-27 02:15:12 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:15:15.618275 | orchestrator | 2025-09-27 02:15:15 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:15:15.619901 | orchestrator | 2025-09-27 02:15:15 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:15:15.620106 | orchestrator | 2025-09-27 02:15:15 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:15:18.666784 | orchestrator | 2025-09-27 02:15:18 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:15:18.668363 | orchestrator | 2025-09-27 02:15:18 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:15:18.668392 | orchestrator | 2025-09-27 02:15:18 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:15:21.719281 | orchestrator | 2025-09-27 02:15:21 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:15:21.720494 | orchestrator | 2025-09-27 02:15:21 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:15:21.720541 | orchestrator | 2025-09-27 02:15:21 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:15:24.763667 | orchestrator | 2025-09-27 02:15:24 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:15:24.766350 | orchestrator | 2025-09-27 02:15:24 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:15:24.766383 | orchestrator | 2025-09-27 02:15:24 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:15:27.817488 | orchestrator | 2025-09-27 02:15:27 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:15:27.818273 | orchestrator | 2025-09-27 02:15:27 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:15:27.818964 | orchestrator | 2025-09-27 02:15:27 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:15:30.868031 | orchestrator | 2025-09-27 02:15:30 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:15:30.869988 | orchestrator | 2025-09-27 02:15:30 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:15:30.870157 | orchestrator | 2025-09-27 02:15:30 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:15:33.918667 | orchestrator | 2025-09-27 02:15:33 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:15:33.919911 | orchestrator | 2025-09-27 02:15:33 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:15:33.919978 | orchestrator | 2025-09-27 02:15:33 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:15:36.967850 | orchestrator | 2025-09-27 02:15:36 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:15:36.969673 | orchestrator | 2025-09-27 02:15:36 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state 
STARTED 2025-09-27 02:15:36.970223 | orchestrator | 2025-09-27 02:15:36 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:15:40.010516 | orchestrator | 2025-09-27 02:15:40 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:15:40.011733 | orchestrator | 2025-09-27 02:15:40 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:15:40.011772 | orchestrator | 2025-09-27 02:15:40 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:15:43.056426 | orchestrator | 2025-09-27 02:15:43 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:15:43.058480 | orchestrator | 2025-09-27 02:15:43 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:15:43.058581 | orchestrator | 2025-09-27 02:15:43 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:15:46.100347 | orchestrator | 2025-09-27 02:15:46 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:15:46.102332 | orchestrator | 2025-09-27 02:15:46 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:15:46.102363 | orchestrator | 2025-09-27 02:15:46 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:15:49.148528 | orchestrator | 2025-09-27 02:15:49 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:15:49.150004 | orchestrator | 2025-09-27 02:15:49 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:15:49.150085 | orchestrator | 2025-09-27 02:15:49 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:15:52.190823 | orchestrator | 2025-09-27 02:15:52 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:15:52.193568 | orchestrator | 2025-09-27 02:15:52 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:15:52.193605 | orchestrator | 2025-09-27 02:15:52 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:15:55.236015 | orchestrator | 2025-09-27 02:15:55 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:15:55.237559 | orchestrator | 2025-09-27 02:15:55 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:15:55.238098 | orchestrator | 2025-09-27 02:15:55 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:15:58.286885 | orchestrator | 2025-09-27 02:15:58 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:15:58.288743 | orchestrator | 2025-09-27 02:15:58 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:15:58.288809 | orchestrator | 2025-09-27 02:15:58 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:16:01.330542 | orchestrator | 2025-09-27 02:16:01 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:16:01.332069 | orchestrator | 2025-09-27 02:16:01 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:16:01.332260 | orchestrator | 2025-09-27 02:16:01 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:16:04.378913 | orchestrator | 2025-09-27 02:16:04 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:16:04.380567 | orchestrator | 2025-09-27 02:16:04 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:16:04.380679 | orchestrator | 2025-09-27 02:16:04 | INFO  | Wait 1 second(s) 
until the next check 2025-09-27 02:16:07.425599 | orchestrator | 2025-09-27 02:16:07 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:16:07.427277 | orchestrator | 2025-09-27 02:16:07 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:16:07.427571 | orchestrator | 2025-09-27 02:16:07 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:16:10.472329 | orchestrator | 2025-09-27 02:16:10 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:16:10.474126 | orchestrator | 2025-09-27 02:16:10 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:16:10.474602 | orchestrator | 2025-09-27 02:16:10 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:16:13.523158 | orchestrator | 2025-09-27 02:16:13 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:16:13.524438 | orchestrator | 2025-09-27 02:16:13 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:16:13.524557 | orchestrator | 2025-09-27 02:16:13 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:16:16.565621 | orchestrator | 2025-09-27 02:16:16 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:16:16.567811 | orchestrator | 2025-09-27 02:16:16 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:16:16.567847 | orchestrator | 2025-09-27 02:16:16 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:16:19.604973 | orchestrator | 2025-09-27 02:16:19 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:16:19.606329 | orchestrator | 2025-09-27 02:16:19 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:16:19.606361 | orchestrator | 2025-09-27 02:16:19 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:16:22.654487 | orchestrator | 2025-09-27 02:16:22 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:16:22.655017 | orchestrator | 2025-09-27 02:16:22 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:16:22.655048 | orchestrator | 2025-09-27 02:16:22 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:16:25.700842 | orchestrator | 2025-09-27 02:16:25 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:16:25.700999 | orchestrator | 2025-09-27 02:16:25 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:16:25.701018 | orchestrator | 2025-09-27 02:16:25 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:16:28.752528 | orchestrator | 2025-09-27 02:16:28 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:16:28.754724 | orchestrator | 2025-09-27 02:16:28 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:16:28.754844 | orchestrator | 2025-09-27 02:16:28 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:16:31.807428 | orchestrator | 2025-09-27 02:16:31 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:16:31.809957 | orchestrator | 2025-09-27 02:16:31 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:16:31.810011 | orchestrator | 2025-09-27 02:16:31 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:16:34.856032 | orchestrator | 2025-09-27 02:16:34 | INFO  | 
Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:16:34.858108 | orchestrator | 2025-09-27 02:16:34 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:16:34.858141 | orchestrator | 2025-09-27 02:16:34 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:16:37.921820 | orchestrator | 2025-09-27 02:16:37 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:16:37.924582 | orchestrator | 2025-09-27 02:16:37 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:16:37.924596 | orchestrator | 2025-09-27 02:16:37 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:16:40.972152 | orchestrator | 2025-09-27 02:16:40 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:16:40.973958 | orchestrator | 2025-09-27 02:16:40 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:16:40.973991 | orchestrator | 2025-09-27 02:16:40 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:16:44.023122 | orchestrator | 2025-09-27 02:16:44 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:16:44.024709 | orchestrator | 2025-09-27 02:16:44 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:16:44.024758 | orchestrator | 2025-09-27 02:16:44 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:16:47.076088 | orchestrator | 2025-09-27 02:16:47 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:16:47.078956 | orchestrator | 2025-09-27 02:16:47 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:16:47.079047 | orchestrator | 2025-09-27 02:16:47 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:16:50.123871 | orchestrator | 2025-09-27 02:16:50 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:16:50.125652 | orchestrator | 2025-09-27 02:16:50 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:16:50.125696 | orchestrator | 2025-09-27 02:16:50 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:16:53.184118 | orchestrator | 2025-09-27 02:16:53 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:16:53.186003 | orchestrator | 2025-09-27 02:16:53 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:16:53.186123 | orchestrator | 2025-09-27 02:16:53 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:16:56.225107 | orchestrator | 2025-09-27 02:16:56 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:16:56.226724 | orchestrator | 2025-09-27 02:16:56 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:16:56.226976 | orchestrator | 2025-09-27 02:16:56 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:16:59.270812 | orchestrator | 2025-09-27 02:16:59 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:16:59.272809 | orchestrator | 2025-09-27 02:16:59 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:16:59.273133 | orchestrator | 2025-09-27 02:16:59 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:17:02.320327 | orchestrator | 2025-09-27 02:17:02 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:17:02.321698 | 
orchestrator | 2025-09-27 02:17:02 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:17:02.321970 | orchestrator | 2025-09-27 02:17:02 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:17:05.369192 | orchestrator | 2025-09-27 02:17:05 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:17:05.371958 | orchestrator | 2025-09-27 02:17:05 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:17:05.372046 | orchestrator | 2025-09-27 02:17:05 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:17:08.421753 | orchestrator | 2025-09-27 02:17:08 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:17:08.423952 | orchestrator | 2025-09-27 02:17:08 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:17:08.424032 | orchestrator | 2025-09-27 02:17:08 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:17:11.472501 | orchestrator | 2025-09-27 02:17:11 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:17:11.474984 | orchestrator | 2025-09-27 02:17:11 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:17:11.475038 | orchestrator | 2025-09-27 02:17:11 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:17:14.525410 | orchestrator | 2025-09-27 02:17:14 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:17:14.543692 | orchestrator | 2025-09-27 02:17:14 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:17:14.543734 | orchestrator | 2025-09-27 02:17:14 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:17:17.573306 | orchestrator | 2025-09-27 02:17:17 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:17:17.573842 | orchestrator | 2025-09-27 02:17:17 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:17:17.573961 | orchestrator | 2025-09-27 02:17:17 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:17:20.621601 | orchestrator | 2025-09-27 02:17:20 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:17:20.624909 | orchestrator | 2025-09-27 02:17:20 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:17:20.624961 | orchestrator | 2025-09-27 02:17:20 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:17:23.668993 | orchestrator | 2025-09-27 02:17:23 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:17:23.670600 | orchestrator | 2025-09-27 02:17:23 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:17:23.670947 | orchestrator | 2025-09-27 02:17:23 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:17:26.722180 | orchestrator | 2025-09-27 02:17:26 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:17:26.723706 | orchestrator | 2025-09-27 02:17:26 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:17:26.723736 | orchestrator | 2025-09-27 02:17:26 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:17:29.778255 | orchestrator | 2025-09-27 02:17:29 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:17:29.780173 | orchestrator | 2025-09-27 02:17:29 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state 
STARTED 2025-09-27 02:17:29.780308 | orchestrator | 2025-09-27 02:17:29 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:17:32.831659 | orchestrator | 2025-09-27 02:17:32 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:17:32.833787 | orchestrator | 2025-09-27 02:17:32 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:17:32.834086 | orchestrator | 2025-09-27 02:17:32 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:17:35.884059 | orchestrator | 2025-09-27 02:17:35 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:17:35.885307 | orchestrator | 2025-09-27 02:17:35 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:17:35.885569 | orchestrator | 2025-09-27 02:17:35 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:17:38.934583 | orchestrator | 2025-09-27 02:17:38 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:17:38.936692 | orchestrator | 2025-09-27 02:17:38 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:17:38.937076 | orchestrator | 2025-09-27 02:17:38 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:17:41.988186 | orchestrator | 2025-09-27 02:17:41 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:17:41.991136 | orchestrator | 2025-09-27 02:17:41 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:17:41.991170 | orchestrator | 2025-09-27 02:17:41 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:17:45.043302 | orchestrator | 2025-09-27 02:17:45 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:17:45.044644 | orchestrator | 2025-09-27 02:17:45 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:17:45.044675 | orchestrator | 2025-09-27 02:17:45 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:17:48.096885 | orchestrator | 2025-09-27 02:17:48 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:17:48.098203 | orchestrator | 2025-09-27 02:17:48 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:17:48.098281 | orchestrator | 2025-09-27 02:17:48 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:17:51.146309 | orchestrator | 2025-09-27 02:17:51 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:17:51.147850 | orchestrator | 2025-09-27 02:17:51 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:17:51.147990 | orchestrator | 2025-09-27 02:17:51 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:17:54.195008 | orchestrator | 2025-09-27 02:17:54 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:17:54.196183 | orchestrator | 2025-09-27 02:17:54 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:17:54.196262 | orchestrator | 2025-09-27 02:17:54 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:17:57.238720 | orchestrator | 2025-09-27 02:17:57 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:17:57.240399 | orchestrator | 2025-09-27 02:17:57 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:17:57.240431 | orchestrator | 2025-09-27 02:17:57 | INFO  | Wait 1 second(s) 
until the next check 2025-09-27 02:18:00.287836 | orchestrator | 2025-09-27 02:18:00 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:18:00.288837 | orchestrator | 2025-09-27 02:18:00 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:18:00.288885 | orchestrator | 2025-09-27 02:18:00 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:18:03.336353 | orchestrator | 2025-09-27 02:18:03 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:18:03.336647 | orchestrator | 2025-09-27 02:18:03 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:18:03.336674 | orchestrator | 2025-09-27 02:18:03 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:18:06.388156 | orchestrator | 2025-09-27 02:18:06 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:18:06.389768 | orchestrator | 2025-09-27 02:18:06 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:18:06.389869 | orchestrator | 2025-09-27 02:18:06 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:18:09.441949 | orchestrator | 2025-09-27 02:18:09 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:18:09.444610 | orchestrator | 2025-09-27 02:18:09 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:18:09.444723 | orchestrator | 2025-09-27 02:18:09 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:18:12.495158 | orchestrator | 2025-09-27 02:18:12 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:18:12.496545 | orchestrator | 2025-09-27 02:18:12 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:18:12.496579 | orchestrator | 2025-09-27 02:18:12 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:18:15.546297 | orchestrator | 2025-09-27 02:18:15 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:18:15.548285 | orchestrator | 2025-09-27 02:18:15 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:18:15.548319 | orchestrator | 2025-09-27 02:18:15 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:18:18.600250 | orchestrator | 2025-09-27 02:18:18 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:18:18.600715 | orchestrator | 2025-09-27 02:18:18 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:18:18.600747 | orchestrator | 2025-09-27 02:18:18 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:18:21.647284 | orchestrator | 2025-09-27 02:18:21 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:18:21.648381 | orchestrator | 2025-09-27 02:18:21 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:18:21.648412 | orchestrator | 2025-09-27 02:18:21 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:18:24.696758 | orchestrator | 2025-09-27 02:18:24 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:18:24.698464 | orchestrator | 2025-09-27 02:18:24 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:18:24.698497 | orchestrator | 2025-09-27 02:18:24 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:18:27.746724 | orchestrator | 2025-09-27 02:18:27 | INFO  | 
Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:18:27.748563 | orchestrator | 2025-09-27 02:18:27 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:18:27.748594 | orchestrator | 2025-09-27 02:18:27 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:18:30.801831 | orchestrator | 2025-09-27 02:18:30 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:18:30.803481 | orchestrator | 2025-09-27 02:18:30 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:18:30.803547 | orchestrator | 2025-09-27 02:18:30 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:18:33.848214 | orchestrator | 2025-09-27 02:18:33 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:18:33.849385 | orchestrator | 2025-09-27 02:18:33 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:18:33.849402 | orchestrator | 2025-09-27 02:18:33 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:18:36.892473 | orchestrator | 2025-09-27 02:18:36 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:18:36.893762 | orchestrator | 2025-09-27 02:18:36 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:18:36.893789 | orchestrator | 2025-09-27 02:18:36 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:18:39.937371 | orchestrator | 2025-09-27 02:18:39 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:18:39.937589 | orchestrator | 2025-09-27 02:18:39 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:18:39.937656 | orchestrator | 2025-09-27 02:18:39 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:18:42.980558 | orchestrator | 2025-09-27 02:18:42 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:18:42.981989 | orchestrator | 2025-09-27 02:18:42 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:18:42.982386 | orchestrator | 2025-09-27 02:18:42 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:18:46.033131 | orchestrator | 2025-09-27 02:18:46 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:18:46.034088 | orchestrator | 2025-09-27 02:18:46 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:18:46.034484 | orchestrator | 2025-09-27 02:18:46 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:18:49.078935 | orchestrator | 2025-09-27 02:18:49 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:18:49.080838 | orchestrator | 2025-09-27 02:18:49 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:18:49.080870 | orchestrator | 2025-09-27 02:18:49 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:18:52.129583 | orchestrator | 2025-09-27 02:18:52 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:18:52.131007 | orchestrator | 2025-09-27 02:18:52 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:18:52.131129 | orchestrator | 2025-09-27 02:18:52 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:18:55.175883 | orchestrator | 2025-09-27 02:18:55 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:18:55.177706 | 
orchestrator | 2025-09-27 02:18:55 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:18:55.177738 | orchestrator | 2025-09-27 02:18:55 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:18:58.228291 | orchestrator | 2025-09-27 02:18:58 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:18:58.230767 | orchestrator | 2025-09-27 02:18:58 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:18:58.230803 | orchestrator | 2025-09-27 02:18:58 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:19:01.283379 | orchestrator | 2025-09-27 02:19:01 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:19:01.286878 | orchestrator | 2025-09-27 02:19:01 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:19:01.286913 | orchestrator | 2025-09-27 02:19:01 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:19:04.335732 | orchestrator | 2025-09-27 02:19:04 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:19:04.337069 | orchestrator | 2025-09-27 02:19:04 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:19:04.337116 | orchestrator | 2025-09-27 02:19:04 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:19:07.382820 | orchestrator | 2025-09-27 02:19:07 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:19:07.385292 | orchestrator | 2025-09-27 02:19:07 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:19:07.385524 | orchestrator | 2025-09-27 02:19:07 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:19:10.433379 | orchestrator | 2025-09-27 02:19:10 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:19:10.434940 | orchestrator | 2025-09-27 02:19:10 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:19:10.435056 | orchestrator | 2025-09-27 02:19:10 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:19:13.486088 | orchestrator | 2025-09-27 02:19:13 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:19:13.488135 | orchestrator | 2025-09-27 02:19:13 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:19:13.488358 | orchestrator | 2025-09-27 02:19:13 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:19:16.539501 | orchestrator | 2025-09-27 02:19:16 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:19:16.540932 | orchestrator | 2025-09-27 02:19:16 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:19:16.541137 | orchestrator | 2025-09-27 02:19:16 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:19:19.584017 | orchestrator | 2025-09-27 02:19:19 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:19:19.584420 | orchestrator | 2025-09-27 02:19:19 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:19:19.584449 | orchestrator | 2025-09-27 02:19:19 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:19:22.628692 | orchestrator | 2025-09-27 02:19:22 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:19:22.630606 | orchestrator | 2025-09-27 02:19:22 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state 
STARTED 2025-09-27 02:19:22.630736 | orchestrator | 2025-09-27 02:19:22 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:19:25.674905 | orchestrator | 2025-09-27 02:19:25 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:19:25.677330 | orchestrator | 2025-09-27 02:19:25 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:19:25.677369 | orchestrator | 2025-09-27 02:19:25 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:19:28.726739 | orchestrator | 2025-09-27 02:19:28 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:19:28.728616 | orchestrator | 2025-09-27 02:19:28 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:19:28.728646 | orchestrator | 2025-09-27 02:19:28 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:19:31.781792 | orchestrator | 2025-09-27 02:19:31 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:19:31.783679 | orchestrator | 2025-09-27 02:19:31 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:19:31.783813 | orchestrator | 2025-09-27 02:19:31 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:19:34.830801 | orchestrator | 2025-09-27 02:19:34 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:19:34.832132 | orchestrator | 2025-09-27 02:19:34 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:19:34.832474 | orchestrator | 2025-09-27 02:19:34 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:19:37.881991 | orchestrator | 2025-09-27 02:19:37 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:19:37.882886 | orchestrator | 2025-09-27 02:19:37 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:19:37.882920 | orchestrator | 2025-09-27 02:19:37 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:19:40.934368 | orchestrator | 2025-09-27 02:19:40 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:19:40.935511 | orchestrator | 2025-09-27 02:19:40 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:19:40.935605 | orchestrator | 2025-09-27 02:19:40 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:19:43.982429 | orchestrator | 2025-09-27 02:19:43 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:19:43.983776 | orchestrator | 2025-09-27 02:19:43 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:19:43.983810 | orchestrator | 2025-09-27 02:19:43 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:19:47.043406 | orchestrator | 2025-09-27 02:19:47 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:19:47.047367 | orchestrator | 2025-09-27 02:19:47 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:19:47.047975 | orchestrator | 2025-09-27 02:19:47 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:19:50.083174 | orchestrator | 2025-09-27 02:19:50 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:19:50.086132 | orchestrator | 2025-09-27 02:19:50 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:19:50.086167 | orchestrator | 2025-09-27 02:19:50 | INFO  | Wait 1 second(s) 
until the next check 2025-09-27 02:19:53.131069 | orchestrator | 2025-09-27 02:19:53 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:19:53.131754 | orchestrator | 2025-09-27 02:19:53 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:19:53.131788 | orchestrator | 2025-09-27 02:19:53 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:19:56.173775 | orchestrator | 2025-09-27 02:19:56 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:19:56.175143 | orchestrator | 2025-09-27 02:19:56 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:19:56.175368 | orchestrator | 2025-09-27 02:19:56 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:19:59.221357 | orchestrator | 2025-09-27 02:19:59 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:19:59.222718 | orchestrator | 2025-09-27 02:19:59 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:19:59.222751 | orchestrator | 2025-09-27 02:19:59 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:20:02.272278 | orchestrator | 2025-09-27 02:20:02 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:20:02.273558 | orchestrator | 2025-09-27 02:20:02 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:20:02.273633 | orchestrator | 2025-09-27 02:20:02 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:20:05.316829 | orchestrator | 2025-09-27 02:20:05 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:20:05.318458 | orchestrator | 2025-09-27 02:20:05 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:20:05.318792 | orchestrator | 2025-09-27 02:20:05 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:20:08.361785 | orchestrator | 2025-09-27 02:20:08 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:20:08.363636 | orchestrator | 2025-09-27 02:20:08 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:20:08.363852 | orchestrator | 2025-09-27 02:20:08 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:20:11.404348 | orchestrator | 2025-09-27 02:20:11 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:20:11.407061 | orchestrator | 2025-09-27 02:20:11 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:20:11.407139 | orchestrator | 2025-09-27 02:20:11 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:20:14.448681 | orchestrator | 2025-09-27 02:20:14 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:20:14.451113 | orchestrator | 2025-09-27 02:20:14 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:20:14.451152 | orchestrator | 2025-09-27 02:20:14 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:20:17.504182 | orchestrator | 2025-09-27 02:20:17 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:20:17.506081 | orchestrator | 2025-09-27 02:20:17 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:20:17.506208 | orchestrator | 2025-09-27 02:20:17 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:20:20.557128 | orchestrator | 2025-09-27 02:20:20 | INFO  | 
Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:20:20.559473 | orchestrator | 2025-09-27 02:20:20 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:20:20.559992 | orchestrator | 2025-09-27 02:20:20 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:20:23.612759 | orchestrator | 2025-09-27 02:20:23 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:20:23.615692 | orchestrator | 2025-09-27 02:20:23 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:20:23.616223 | orchestrator | 2025-09-27 02:20:23 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:20:26.664194 | orchestrator | 2025-09-27 02:20:26 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:20:26.667227 | orchestrator | 2025-09-27 02:20:26 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:20:26.667290 | orchestrator | 2025-09-27 02:20:26 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:20:29.717286 | orchestrator | 2025-09-27 02:20:29 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:20:29.718879 | orchestrator | 2025-09-27 02:20:29 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:20:29.718945 | orchestrator | 2025-09-27 02:20:29 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:20:32.768212 | orchestrator | 2025-09-27 02:20:32 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:20:32.769097 | orchestrator | 2025-09-27 02:20:32 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:20:32.769175 | orchestrator | 2025-09-27 02:20:32 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:20:35.816548 | orchestrator | 2025-09-27 02:20:35 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:20:35.817946 | orchestrator | 2025-09-27 02:20:35 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:20:35.817979 | orchestrator | 2025-09-27 02:20:35 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:20:38.864497 | orchestrator | 2025-09-27 02:20:38 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:20:38.866353 | orchestrator | 2025-09-27 02:20:38 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:20:38.866441 | orchestrator | 2025-09-27 02:20:38 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:20:41.911555 | orchestrator | 2025-09-27 02:20:41 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:20:41.912957 | orchestrator | 2025-09-27 02:20:41 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:20:41.913060 | orchestrator | 2025-09-27 02:20:41 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:20:44.956075 | orchestrator | 2025-09-27 02:20:44 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:20:44.957263 | orchestrator | 2025-09-27 02:20:44 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:20:44.957348 | orchestrator | 2025-09-27 02:20:44 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:20:48.008893 | orchestrator | 2025-09-27 02:20:48 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:20:48.010212 | 
orchestrator | 2025-09-27 02:20:48 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED
[repetitive polling output collapsed: from 2025-09-27 02:20:48 to 02:30:12 the orchestrator re-checks roughly every 3 seconds, each pass logging "Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED", "Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED", and "Wait 1 second(s) until the next check"]
2025-09-27 02:30:12.431972 | orchestrator | 2025-09-27 02:30:12 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:30:12.434866 |
orchestrator | 2025-09-27 02:30:12 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:30:12.435365 | orchestrator | 2025-09-27 02:30:12 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:30:15.476648 | orchestrator | 2025-09-27 02:30:15 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:30:15.477822 | orchestrator | 2025-09-27 02:30:15 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:30:15.477899 | orchestrator | 2025-09-27 02:30:15 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:30:18.525158 | orchestrator | 2025-09-27 02:30:18 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:30:18.527499 | orchestrator | 2025-09-27 02:30:18 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:30:18.527786 | orchestrator | 2025-09-27 02:30:18 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:30:21.578172 | orchestrator | 2025-09-27 02:30:21 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:30:21.580245 | orchestrator | 2025-09-27 02:30:21 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:30:21.580283 | orchestrator | 2025-09-27 02:30:21 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:30:24.624257 | orchestrator | 2025-09-27 02:30:24 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:30:24.624739 | orchestrator | 2025-09-27 02:30:24 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:30:24.624768 | orchestrator | 2025-09-27 02:30:24 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:30:27.675695 | orchestrator | 2025-09-27 02:30:27 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:30:27.676127 | orchestrator | 2025-09-27 02:30:27 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:30:27.676201 | orchestrator | 2025-09-27 02:30:27 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:30:30.720931 | orchestrator | 2025-09-27 02:30:30 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:30:30.722210 | orchestrator | 2025-09-27 02:30:30 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:30:30.722245 | orchestrator | 2025-09-27 02:30:30 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:30:33.768792 | orchestrator | 2025-09-27 02:30:33 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:30:33.770824 | orchestrator | 2025-09-27 02:30:33 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:30:33.770858 | orchestrator | 2025-09-27 02:30:33 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:30:36.809520 | orchestrator | 2025-09-27 02:30:36 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:30:36.811238 | orchestrator | 2025-09-27 02:30:36 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:30:36.811278 | orchestrator | 2025-09-27 02:30:36 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:30:39.858610 | orchestrator | 2025-09-27 02:30:39 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:30:39.860961 | orchestrator | 2025-09-27 02:30:39 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state 
STARTED 2025-09-27 02:30:39.861003 | orchestrator | 2025-09-27 02:30:39 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:30:42.906892 | orchestrator | 2025-09-27 02:30:42 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:30:42.909471 | orchestrator | 2025-09-27 02:30:42 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:30:42.909509 | orchestrator | 2025-09-27 02:30:42 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:30:45.956581 | orchestrator | 2025-09-27 02:30:45 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:30:45.957594 | orchestrator | 2025-09-27 02:30:45 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:30:45.958357 | orchestrator | 2025-09-27 02:30:45 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:30:49.008930 | orchestrator | 2025-09-27 02:30:49 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:30:49.010643 | orchestrator | 2025-09-27 02:30:49 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:30:49.010810 | orchestrator | 2025-09-27 02:30:49 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:30:52.062526 | orchestrator | 2025-09-27 02:30:52 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:30:52.062808 | orchestrator | 2025-09-27 02:30:52 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:30:52.062832 | orchestrator | 2025-09-27 02:30:52 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:30:55.110135 | orchestrator | 2025-09-27 02:30:55 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:30:55.110381 | orchestrator | 2025-09-27 02:30:55 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:30:55.110405 | orchestrator | 2025-09-27 02:30:55 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:30:58.155914 | orchestrator | 2025-09-27 02:30:58 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:30:58.157696 | orchestrator | 2025-09-27 02:30:58 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:30:58.157922 | orchestrator | 2025-09-27 02:30:58 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:31:01.209385 | orchestrator | 2025-09-27 02:31:01 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:31:01.210263 | orchestrator | 2025-09-27 02:31:01 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:31:01.210312 | orchestrator | 2025-09-27 02:31:01 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:31:04.257656 | orchestrator | 2025-09-27 02:31:04 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:31:04.258860 | orchestrator | 2025-09-27 02:31:04 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:31:04.258905 | orchestrator | 2025-09-27 02:31:04 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:31:07.312449 | orchestrator | 2025-09-27 02:31:07 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:31:07.313449 | orchestrator | 2025-09-27 02:31:07 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:31:07.313483 | orchestrator | 2025-09-27 02:31:07 | INFO  | Wait 1 second(s) 
until the next check 2025-09-27 02:31:10.360041 | orchestrator | 2025-09-27 02:31:10 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:31:10.361582 | orchestrator | 2025-09-27 02:31:10 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:31:10.361621 | orchestrator | 2025-09-27 02:31:10 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:31:13.406266 | orchestrator | 2025-09-27 02:31:13 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:31:13.408036 | orchestrator | 2025-09-27 02:31:13 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:31:13.408071 | orchestrator | 2025-09-27 02:31:13 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:31:16.451899 | orchestrator | 2025-09-27 02:31:16 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:31:16.454160 | orchestrator | 2025-09-27 02:31:16 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:31:16.454249 | orchestrator | 2025-09-27 02:31:16 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:31:19.500665 | orchestrator | 2025-09-27 02:31:19 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:31:19.501465 | orchestrator | 2025-09-27 02:31:19 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:31:19.501724 | orchestrator | 2025-09-27 02:31:19 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:31:22.541894 | orchestrator | 2025-09-27 02:31:22 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:31:22.542662 | orchestrator | 2025-09-27 02:31:22 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:31:22.542703 | orchestrator | 2025-09-27 02:31:22 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:31:25.585005 | orchestrator | 2025-09-27 02:31:25 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:31:25.586486 | orchestrator | 2025-09-27 02:31:25 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:31:25.586522 | orchestrator | 2025-09-27 02:31:25 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:31:28.639668 | orchestrator | 2025-09-27 02:31:28 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:31:28.640760 | orchestrator | 2025-09-27 02:31:28 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:31:28.640800 | orchestrator | 2025-09-27 02:31:28 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:31:31.689972 | orchestrator | 2025-09-27 02:31:31 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:31:31.692209 | orchestrator | 2025-09-27 02:31:31 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:31:31.692347 | orchestrator | 2025-09-27 02:31:31 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:31:34.740381 | orchestrator | 2025-09-27 02:31:34 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:31:34.741499 | orchestrator | 2025-09-27 02:31:34 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:31:34.741532 | orchestrator | 2025-09-27 02:31:34 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:31:37.791841 | orchestrator | 2025-09-27 02:31:37 | INFO  | 
Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:31:37.792971 | orchestrator | 2025-09-27 02:31:37 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:31:37.793006 | orchestrator | 2025-09-27 02:31:37 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:31:40.830771 | orchestrator | 2025-09-27 02:31:40 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:31:40.832988 | orchestrator | 2025-09-27 02:31:40 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:31:40.833019 | orchestrator | 2025-09-27 02:31:40 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:31:43.875428 | orchestrator | 2025-09-27 02:31:43 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:31:43.876270 | orchestrator | 2025-09-27 02:31:43 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:31:43.876605 | orchestrator | 2025-09-27 02:31:43 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:31:46.925211 | orchestrator | 2025-09-27 02:31:46 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:31:46.926518 | orchestrator | 2025-09-27 02:31:46 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:31:46.926623 | orchestrator | 2025-09-27 02:31:46 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:31:49.972026 | orchestrator | 2025-09-27 02:31:49 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:31:49.973364 | orchestrator | 2025-09-27 02:31:49 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:31:49.973392 | orchestrator | 2025-09-27 02:31:49 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:31:53.021193 | orchestrator | 2025-09-27 02:31:53 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:31:53.022128 | orchestrator | 2025-09-27 02:31:53 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:31:53.022362 | orchestrator | 2025-09-27 02:31:53 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:31:56.059777 | orchestrator | 2025-09-27 02:31:56 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:31:56.061090 | orchestrator | 2025-09-27 02:31:56 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:31:56.061122 | orchestrator | 2025-09-27 02:31:56 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:31:59.106001 | orchestrator | 2025-09-27 02:31:59 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:31:59.107468 | orchestrator | 2025-09-27 02:31:59 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:31:59.107758 | orchestrator | 2025-09-27 02:31:59 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:32:02.153823 | orchestrator | 2025-09-27 02:32:02 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:32:02.155383 | orchestrator | 2025-09-27 02:32:02 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:32:02.155416 | orchestrator | 2025-09-27 02:32:02 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:32:05.200495 | orchestrator | 2025-09-27 02:32:05 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:32:05.202172 | 
orchestrator | 2025-09-27 02:32:05 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:32:05.202345 | orchestrator | 2025-09-27 02:32:05 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:32:08.254527 | orchestrator | 2025-09-27 02:32:08 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:32:08.256406 | orchestrator | 2025-09-27 02:32:08 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:32:08.256441 | orchestrator | 2025-09-27 02:32:08 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:32:11.306856 | orchestrator | 2025-09-27 02:32:11 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:32:11.308161 | orchestrator | 2025-09-27 02:32:11 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:32:11.308200 | orchestrator | 2025-09-27 02:32:11 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:32:14.363408 | orchestrator | 2025-09-27 02:32:14 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:32:14.364649 | orchestrator | 2025-09-27 02:32:14 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:32:14.364681 | orchestrator | 2025-09-27 02:32:14 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:32:17.415441 | orchestrator | 2025-09-27 02:32:17 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:32:17.419017 | orchestrator | 2025-09-27 02:32:17 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:32:17.419051 | orchestrator | 2025-09-27 02:32:17 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:32:20.473448 | orchestrator | 2025-09-27 02:32:20 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:32:20.473558 | orchestrator | 2025-09-27 02:32:20 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:32:20.473573 | orchestrator | 2025-09-27 02:32:20 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:32:23.521424 | orchestrator | 2025-09-27 02:32:23 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:32:23.523617 | orchestrator | 2025-09-27 02:32:23 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:32:23.523671 | orchestrator | 2025-09-27 02:32:23 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:32:26.569120 | orchestrator | 2025-09-27 02:32:26 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:32:26.570253 | orchestrator | 2025-09-27 02:32:26 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:32:26.570389 | orchestrator | 2025-09-27 02:32:26 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:32:29.616352 | orchestrator | 2025-09-27 02:32:29 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:32:29.618079 | orchestrator | 2025-09-27 02:32:29 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:32:29.618129 | orchestrator | 2025-09-27 02:32:29 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:32:32.657673 | orchestrator | 2025-09-27 02:32:32 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:32:32.658881 | orchestrator | 2025-09-27 02:32:32 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state 
STARTED 2025-09-27 02:32:32.658966 | orchestrator | 2025-09-27 02:32:32 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:32:35.701986 | orchestrator | 2025-09-27 02:32:35 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:32:35.703418 | orchestrator | 2025-09-27 02:32:35 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:32:35.703494 | orchestrator | 2025-09-27 02:32:35 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:32:38.751103 | orchestrator | 2025-09-27 02:32:38 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:32:38.752450 | orchestrator | 2025-09-27 02:32:38 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:32:38.752570 | orchestrator | 2025-09-27 02:32:38 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:32:41.804801 | orchestrator | 2025-09-27 02:32:41 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:32:41.805652 | orchestrator | 2025-09-27 02:32:41 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:32:41.805685 | orchestrator | 2025-09-27 02:32:41 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:32:44.849981 | orchestrator | 2025-09-27 02:32:44 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:32:44.851528 | orchestrator | 2025-09-27 02:32:44 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:32:44.851810 | orchestrator | 2025-09-27 02:32:44 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:32:47.896994 | orchestrator | 2025-09-27 02:32:47 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:32:47.898424 | orchestrator | 2025-09-27 02:32:47 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:32:47.898493 | orchestrator | 2025-09-27 02:32:47 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:32:50.944201 | orchestrator | 2025-09-27 02:32:50 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:32:50.946188 | orchestrator | 2025-09-27 02:32:50 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:32:50.946222 | orchestrator | 2025-09-27 02:32:50 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:32:53.986529 | orchestrator | 2025-09-27 02:32:53 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:32:53.988485 | orchestrator | 2025-09-27 02:32:53 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:32:53.988570 | orchestrator | 2025-09-27 02:32:53 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:32:57.040413 | orchestrator | 2025-09-27 02:32:57 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:32:57.041657 | orchestrator | 2025-09-27 02:32:57 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:32:57.041743 | orchestrator | 2025-09-27 02:32:57 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:33:00.091508 | orchestrator | 2025-09-27 02:33:00 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:33:00.094264 | orchestrator | 2025-09-27 02:33:00 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:33:00.094363 | orchestrator | 2025-09-27 02:33:00 | INFO  | Wait 1 second(s) 
until the next check 2025-09-27 02:33:03.143837 | orchestrator | 2025-09-27 02:33:03 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:33:03.146432 | orchestrator | 2025-09-27 02:33:03 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:33:03.146525 | orchestrator | 2025-09-27 02:33:03 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:33:06.191345 | orchestrator | 2025-09-27 02:33:06 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:33:06.192724 | orchestrator | 2025-09-27 02:33:06 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:33:06.192761 | orchestrator | 2025-09-27 02:33:06 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:33:09.241754 | orchestrator | 2025-09-27 02:33:09 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:33:09.243395 | orchestrator | 2025-09-27 02:33:09 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:33:09.243437 | orchestrator | 2025-09-27 02:33:09 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:33:12.295302 | orchestrator | 2025-09-27 02:33:12 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:33:12.295495 | orchestrator | 2025-09-27 02:33:12 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:33:12.295516 | orchestrator | 2025-09-27 02:33:12 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:33:15.345163 | orchestrator | 2025-09-27 02:33:15 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:33:15.345301 | orchestrator | 2025-09-27 02:33:15 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:33:15.345313 | orchestrator | 2025-09-27 02:33:15 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:33:18.388143 | orchestrator | 2025-09-27 02:33:18 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:33:18.389020 | orchestrator | 2025-09-27 02:33:18 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:33:18.389053 | orchestrator | 2025-09-27 02:33:18 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:33:21.434690 | orchestrator | 2025-09-27 02:33:21 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:33:21.435494 | orchestrator | 2025-09-27 02:33:21 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:33:21.435756 | orchestrator | 2025-09-27 02:33:21 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:33:24.482399 | orchestrator | 2025-09-27 02:33:24 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:33:24.483598 | orchestrator | 2025-09-27 02:33:24 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:33:24.483629 | orchestrator | 2025-09-27 02:33:24 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:33:27.539186 | orchestrator | 2025-09-27 02:33:27 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:33:27.539749 | orchestrator | 2025-09-27 02:33:27 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:33:27.539832 | orchestrator | 2025-09-27 02:33:27 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:33:30.585173 | orchestrator | 2025-09-27 02:33:30 | INFO  | 
Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:33:30.585867 | orchestrator | 2025-09-27 02:33:30 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:33:30.585899 | orchestrator | 2025-09-27 02:33:30 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:33:33.631380 | orchestrator | 2025-09-27 02:33:33 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:33:33.632853 | orchestrator | 2025-09-27 02:33:33 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:33:33.632880 | orchestrator | 2025-09-27 02:33:33 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:33:36.679318 | orchestrator | 2025-09-27 02:33:36 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:33:36.681069 | orchestrator | 2025-09-27 02:33:36 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:33:36.681104 | orchestrator | 2025-09-27 02:33:36 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:33:39.741773 | orchestrator | 2025-09-27 02:33:39 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:33:39.743616 | orchestrator | 2025-09-27 02:33:39 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:33:39.743794 | orchestrator | 2025-09-27 02:33:39 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:33:42.786685 | orchestrator | 2025-09-27 02:33:42 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:33:42.787750 | orchestrator | 2025-09-27 02:33:42 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:33:42.787779 | orchestrator | 2025-09-27 02:33:42 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:33:45.829987 | orchestrator | 2025-09-27 02:33:45 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:33:45.831831 | orchestrator | 2025-09-27 02:33:45 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:33:45.831905 | orchestrator | 2025-09-27 02:33:45 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:33:48.882082 | orchestrator | 2025-09-27 02:33:48 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:33:48.884896 | orchestrator | 2025-09-27 02:33:48 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:33:48.884998 | orchestrator | 2025-09-27 02:33:48 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:33:51.938456 | orchestrator | 2025-09-27 02:33:51 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:33:51.940938 | orchestrator | 2025-09-27 02:33:51 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:33:51.941511 | orchestrator | 2025-09-27 02:33:51 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:33:55.007381 | orchestrator | 2025-09-27 02:33:55 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:33:55.008232 | orchestrator | 2025-09-27 02:33:55 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:33:55.008263 | orchestrator | 2025-09-27 02:33:55 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:33:58.054869 | orchestrator | 2025-09-27 02:33:58 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:33:58.056918 | 
orchestrator | 2025-09-27 02:33:58 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:33:58.057006 | orchestrator | 2025-09-27 02:33:58 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:34:01.101332 | orchestrator | 2025-09-27 02:34:01 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:34:01.104639 | orchestrator | 2025-09-27 02:34:01 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:34:01.104723 | orchestrator | 2025-09-27 02:34:01 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:34:04.156693 | orchestrator | 2025-09-27 02:34:04 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:34:04.160508 | orchestrator | 2025-09-27 02:34:04 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:34:04.160751 | orchestrator | 2025-09-27 02:34:04 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:34:07.207548 | orchestrator | 2025-09-27 02:34:07 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:34:07.209737 | orchestrator | 2025-09-27 02:34:07 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:34:07.209815 | orchestrator | 2025-09-27 02:34:07 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:34:10.252606 | orchestrator | 2025-09-27 02:34:10 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:34:10.254556 | orchestrator | 2025-09-27 02:34:10 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:34:10.254731 | orchestrator | 2025-09-27 02:34:10 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:34:13.297745 | orchestrator | 2025-09-27 02:34:13 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:34:13.299059 | orchestrator | 2025-09-27 02:34:13 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:34:13.299090 | orchestrator | 2025-09-27 02:34:13 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:34:16.340577 | orchestrator | 2025-09-27 02:34:16 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:34:16.342015 | orchestrator | 2025-09-27 02:34:16 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:34:16.342368 | orchestrator | 2025-09-27 02:34:16 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:34:19.387881 | orchestrator | 2025-09-27 02:34:19 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:34:19.389319 | orchestrator | 2025-09-27 02:34:19 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:34:19.389346 | orchestrator | 2025-09-27 02:34:19 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:34:22.438346 | orchestrator | 2025-09-27 02:34:22 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:34:22.440226 | orchestrator | 2025-09-27 02:34:22 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:34:22.440316 | orchestrator | 2025-09-27 02:34:22 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:34:25.482817 | orchestrator | 2025-09-27 02:34:25 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:34:25.483772 | orchestrator | 2025-09-27 02:34:25 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state 
STARTED 2025-09-27 02:34:25.483820 | orchestrator | 2025-09-27 02:34:25 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:34:28.528936 | orchestrator | 2025-09-27 02:34:28 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:34:28.530679 | orchestrator | 2025-09-27 02:34:28 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:34:28.530710 | orchestrator | 2025-09-27 02:34:28 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:34:31.575037 | orchestrator | 2025-09-27 02:34:31 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:34:31.576497 | orchestrator | 2025-09-27 02:34:31 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:34:31.576526 | orchestrator | 2025-09-27 02:34:31 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:34:34.616882 | orchestrator | 2025-09-27 02:34:34 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:34:34.617824 | orchestrator | 2025-09-27 02:34:34 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:34:34.617856 | orchestrator | 2025-09-27 02:34:34 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:34:37.664264 | orchestrator | 2025-09-27 02:34:37 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:34:37.666934 | orchestrator | 2025-09-27 02:34:37 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:34:37.667033 | orchestrator | 2025-09-27 02:34:37 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:34:40.718992 | orchestrator | 2025-09-27 02:34:40 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:34:40.722559 | orchestrator | 2025-09-27 02:34:40 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:34:40.722737 | orchestrator | 2025-09-27 02:34:40 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:34:43.766187 | orchestrator | 2025-09-27 02:34:43 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:34:43.767280 | orchestrator | 2025-09-27 02:34:43 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:34:43.767311 | orchestrator | 2025-09-27 02:34:43 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:34:46.814623 | orchestrator | 2025-09-27 02:34:46 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:34:46.815504 | orchestrator | 2025-09-27 02:34:46 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:34:46.815536 | orchestrator | 2025-09-27 02:34:46 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:34:49.864779 | orchestrator | 2025-09-27 02:34:49 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:34:49.867302 | orchestrator | 2025-09-27 02:34:49 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:34:49.867416 | orchestrator | 2025-09-27 02:34:49 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:34:52.912389 | orchestrator | 2025-09-27 02:34:52 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:34:52.915244 | orchestrator | 2025-09-27 02:34:52 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:34:52.915274 | orchestrator | 2025-09-27 02:34:52 | INFO  | Wait 1 second(s) 
until the next check 2025-09-27 02:34:55.959254 | orchestrator | 2025-09-27 02:34:55 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:34:55.961680 | orchestrator | 2025-09-27 02:34:55 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:34:55.961709 | orchestrator | 2025-09-27 02:34:55 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:34:59.006247 | orchestrator | 2025-09-27 02:34:59 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:34:59.006356 | orchestrator | 2025-09-27 02:34:59 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:34:59.006372 | orchestrator | 2025-09-27 02:34:59 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:35:02.053411 | orchestrator | 2025-09-27 02:35:02 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:35:02.054455 | orchestrator | 2025-09-27 02:35:02 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:35:02.054484 | orchestrator | 2025-09-27 02:35:02 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:35:05.097642 | orchestrator | 2025-09-27 02:35:05 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:35:05.098694 | orchestrator | 2025-09-27 02:35:05 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:35:05.098725 | orchestrator | 2025-09-27 02:35:05 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:35:08.145939 | orchestrator | 2025-09-27 02:35:08 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:35:08.147818 | orchestrator | 2025-09-27 02:35:08 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:35:08.147872 | orchestrator | 2025-09-27 02:35:08 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:35:11.194618 | orchestrator | 2025-09-27 02:35:11 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:35:11.195968 | orchestrator | 2025-09-27 02:35:11 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:35:11.196207 | orchestrator | 2025-09-27 02:35:11 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:35:14.246610 | orchestrator | 2025-09-27 02:35:14 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:35:14.248334 | orchestrator | 2025-09-27 02:35:14 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:35:14.248360 | orchestrator | 2025-09-27 02:35:14 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:35:17.295840 | orchestrator | 2025-09-27 02:35:17 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:35:17.302059 | orchestrator | 2025-09-27 02:35:17 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:35:17.302185 | orchestrator | 2025-09-27 02:35:17 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:35:20.344785 | orchestrator | 2025-09-27 02:35:20 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:35:20.346393 | orchestrator | 2025-09-27 02:35:20 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:35:20.346441 | orchestrator | 2025-09-27 02:35:20 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:35:23.387811 | orchestrator | 2025-09-27 02:35:23 | INFO  | 
Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:35:23.390219 | orchestrator | 2025-09-27 02:35:23 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:35:23.390253 | orchestrator | 2025-09-27 02:35:23 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:35:26.436190 | orchestrator | 2025-09-27 02:35:26 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:35:26.437819 | orchestrator | 2025-09-27 02:35:26 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:35:26.437876 | orchestrator | 2025-09-27 02:35:26 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:35:29.482601 | orchestrator | 2025-09-27 02:35:29 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:35:29.483734 | orchestrator | 2025-09-27 02:35:29 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:35:29.483764 | orchestrator | 2025-09-27 02:35:29 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:35:32.533655 | orchestrator | 2025-09-27 02:35:32 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:35:32.535478 | orchestrator | 2025-09-27 02:35:32 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:35:32.535503 | orchestrator | 2025-09-27 02:35:32 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:35:35.579681 | orchestrator | 2025-09-27 02:35:35 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:35:35.581815 | orchestrator | 2025-09-27 02:35:35 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:35:35.581843 | orchestrator | 2025-09-27 02:35:35 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:35:38.621880 | orchestrator | 2025-09-27 02:35:38 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:35:38.622699 | orchestrator | 2025-09-27 02:35:38 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:35:38.622885 | orchestrator | 2025-09-27 02:35:38 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:35:41.665317 | orchestrator | 2025-09-27 02:35:41 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:35:41.667152 | orchestrator | 2025-09-27 02:35:41 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:35:41.667183 | orchestrator | 2025-09-27 02:35:41 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:35:44.718605 | orchestrator | 2025-09-27 02:35:44 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:35:44.719196 | orchestrator | 2025-09-27 02:35:44 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:35:44.719310 | orchestrator | 2025-09-27 02:35:44 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:35:47.765879 | orchestrator | 2025-09-27 02:35:47 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:35:47.767370 | orchestrator | 2025-09-27 02:35:47 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:35:47.767456 | orchestrator | 2025-09-27 02:35:47 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:35:50.811934 | orchestrator | 2025-09-27 02:35:50 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:35:50.813178 | 
orchestrator | 2025-09-27 02:35:50 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:35:50.813422 | orchestrator | 2025-09-27 02:35:50 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:35:53.862869 | orchestrator | 2025-09-27 02:35:53 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:35:53.865298 | orchestrator | 2025-09-27 02:35:53 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:35:53.865380 | orchestrator | 2025-09-27 02:35:53 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:35:56.905559 | orchestrator | 2025-09-27 02:35:56 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:35:56.907776 | orchestrator | 2025-09-27 02:35:56 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:35:56.907809 | orchestrator | 2025-09-27 02:35:56 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:35:59.954513 | orchestrator | 2025-09-27 02:35:59 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:35:59.955803 | orchestrator | 2025-09-27 02:35:59 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:35:59.955832 | orchestrator | 2025-09-27 02:35:59 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:36:03.003455 | orchestrator | 2025-09-27 02:36:03 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:36:03.005861 | orchestrator | 2025-09-27 02:36:03 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:36:03.006001 | orchestrator | 2025-09-27 02:36:03 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:36:06.054138 | orchestrator | 2025-09-27 02:36:06 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:36:06.055305 | orchestrator | 2025-09-27 02:36:06 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:36:06.055339 | orchestrator | 2025-09-27 02:36:06 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:36:09.094806 | orchestrator | 2025-09-27 02:36:09 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:36:09.095823 | orchestrator | 2025-09-27 02:36:09 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:36:09.095957 | orchestrator | 2025-09-27 02:36:09 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:36:12.140335 | orchestrator | 2025-09-27 02:36:12 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:36:12.142529 | orchestrator | 2025-09-27 02:36:12 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:36:12.142573 | orchestrator | 2025-09-27 02:36:12 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:36:15.187309 | orchestrator | 2025-09-27 02:36:15 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:36:15.190354 | orchestrator | 2025-09-27 02:36:15 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:36:15.190392 | orchestrator | 2025-09-27 02:36:15 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:36:18.245015 | orchestrator | 2025-09-27 02:36:18 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:36:18.246195 | orchestrator | 2025-09-27 02:36:18 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state 
STARTED 2025-09-27 02:36:18.246301 | orchestrator | 2025-09-27 02:36:18 | INFO  | Wait 1 second(s) until the next check
[2025-09-27 02:36:21 – 02:45:39 | orchestrator | INFO  | Tasks c8c195a8-0572-4728-82e9-0d11795e0ba9 and 6080a85d-265e-44df-8fd4-200b92feb3b5 were polled roughly every 3 seconds; both remained in state STARTED, with "Wait 1 second(s) until the next check" logged after every check]
2025-09-27 02:45:42.100448 | orchestrator | 2025-09-27 02:45:42 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:45:42.103157 | orchestrator | 2025-09-27 02:45:42 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state
STARTED 2025-09-27 02:45:42.103263 | orchestrator | 2025-09-27 02:45:42 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:45:45.149522 | orchestrator | 2025-09-27 02:45:45 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:45:45.150492 | orchestrator | 2025-09-27 02:45:45 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:45:45.150533 | orchestrator | 2025-09-27 02:45:45 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:45:48.195168 | orchestrator | 2025-09-27 02:45:48 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:45:48.197092 | orchestrator | 2025-09-27 02:45:48 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:45:48.197125 | orchestrator | 2025-09-27 02:45:48 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:45:51.234554 | orchestrator | 2025-09-27 02:45:51 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:45:51.235013 | orchestrator | 2025-09-27 02:45:51 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:45:51.235042 | orchestrator | 2025-09-27 02:45:51 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:45:54.279974 | orchestrator | 2025-09-27 02:45:54 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:45:54.281237 | orchestrator | 2025-09-27 02:45:54 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:45:54.281271 | orchestrator | 2025-09-27 02:45:54 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:45:57.323804 | orchestrator | 2025-09-27 02:45:57 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:45:57.324453 | orchestrator | 2025-09-27 02:45:57 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:45:57.324489 | orchestrator | 2025-09-27 02:45:57 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:46:00.372109 | orchestrator | 2025-09-27 02:46:00 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:46:00.373148 | orchestrator | 2025-09-27 02:46:00 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:46:00.373198 | orchestrator | 2025-09-27 02:46:00 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:46:03.421692 | orchestrator | 2025-09-27 02:46:03 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:46:03.423118 | orchestrator | 2025-09-27 02:46:03 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:46:03.423154 | orchestrator | 2025-09-27 02:46:03 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:46:06.467595 | orchestrator | 2025-09-27 02:46:06 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:46:06.469864 | orchestrator | 2025-09-27 02:46:06 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:46:06.469905 | orchestrator | 2025-09-27 02:46:06 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:46:09.515243 | orchestrator | 2025-09-27 02:46:09 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:46:09.517298 | orchestrator | 2025-09-27 02:46:09 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:46:09.517331 | orchestrator | 2025-09-27 02:46:09 | INFO  | Wait 1 second(s) 
until the next check 2025-09-27 02:46:12.561779 | orchestrator | 2025-09-27 02:46:12 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:46:12.563669 | orchestrator | 2025-09-27 02:46:12 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:46:12.563703 | orchestrator | 2025-09-27 02:46:12 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:46:15.612636 | orchestrator | 2025-09-27 02:46:15 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:46:15.614625 | orchestrator | 2025-09-27 02:46:15 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:46:15.614957 | orchestrator | 2025-09-27 02:46:15 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:46:18.668133 | orchestrator | 2025-09-27 02:46:18 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:46:18.669124 | orchestrator | 2025-09-27 02:46:18 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:46:18.669157 | orchestrator | 2025-09-27 02:46:18 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:46:21.712477 | orchestrator | 2025-09-27 02:46:21 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:46:21.713447 | orchestrator | 2025-09-27 02:46:21 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:46:21.713478 | orchestrator | 2025-09-27 02:46:21 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:46:24.759652 | orchestrator | 2025-09-27 02:46:24 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:46:24.762940 | orchestrator | 2025-09-27 02:46:24 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:46:24.763852 | orchestrator | 2025-09-27 02:46:24 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:46:27.810085 | orchestrator | 2025-09-27 02:46:27 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:46:27.812395 | orchestrator | 2025-09-27 02:46:27 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:46:27.812471 | orchestrator | 2025-09-27 02:46:27 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:46:30.861295 | orchestrator | 2025-09-27 02:46:30 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:46:30.863324 | orchestrator | 2025-09-27 02:46:30 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:46:30.863359 | orchestrator | 2025-09-27 02:46:30 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:46:33.915181 | orchestrator | 2025-09-27 02:46:33 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:46:33.916699 | orchestrator | 2025-09-27 02:46:33 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:46:33.916817 | orchestrator | 2025-09-27 02:46:33 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:46:36.961467 | orchestrator | 2025-09-27 02:46:36 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:46:36.962324 | orchestrator | 2025-09-27 02:46:36 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:46:36.962563 | orchestrator | 2025-09-27 02:46:36 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:46:40.014990 | orchestrator | 2025-09-27 02:46:40 | INFO  | 
Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:46:40.015577 | orchestrator | 2025-09-27 02:46:40 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:46:40.015610 | orchestrator | 2025-09-27 02:46:40 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:46:43.070394 | orchestrator | 2025-09-27 02:46:43 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:46:43.072489 | orchestrator | 2025-09-27 02:46:43 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:46:43.072521 | orchestrator | 2025-09-27 02:46:43 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:46:46.118581 | orchestrator | 2025-09-27 02:46:46 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:46:46.121587 | orchestrator | 2025-09-27 02:46:46 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:46:46.121621 | orchestrator | 2025-09-27 02:46:46 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:46:49.175792 | orchestrator | 2025-09-27 02:46:49 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:46:49.177442 | orchestrator | 2025-09-27 02:46:49 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:46:49.177486 | orchestrator | 2025-09-27 02:46:49 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:46:52.225602 | orchestrator | 2025-09-27 02:46:52 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:46:52.226634 | orchestrator | 2025-09-27 02:46:52 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:46:52.226671 | orchestrator | 2025-09-27 02:46:52 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:46:55.275361 | orchestrator | 2025-09-27 02:46:55 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:46:55.276713 | orchestrator | 2025-09-27 02:46:55 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:46:55.277644 | orchestrator | 2025-09-27 02:46:55 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:46:58.324712 | orchestrator | 2025-09-27 02:46:58 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:46:58.325263 | orchestrator | 2025-09-27 02:46:58 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:46:58.325296 | orchestrator | 2025-09-27 02:46:58 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:47:01.371424 | orchestrator | 2025-09-27 02:47:01 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:47:01.372358 | orchestrator | 2025-09-27 02:47:01 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:47:01.372390 | orchestrator | 2025-09-27 02:47:01 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:47:04.422920 | orchestrator | 2025-09-27 02:47:04 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:47:04.424536 | orchestrator | 2025-09-27 02:47:04 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:47:04.424568 | orchestrator | 2025-09-27 02:47:04 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:47:07.467499 | orchestrator | 2025-09-27 02:47:07 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:47:07.470076 | 
orchestrator | 2025-09-27 02:47:07 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:47:07.470562 | orchestrator | 2025-09-27 02:47:07 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:47:10.511417 | orchestrator | 2025-09-27 02:47:10 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:47:10.512941 | orchestrator | 2025-09-27 02:47:10 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:47:10.513025 | orchestrator | 2025-09-27 02:47:10 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:47:13.562316 | orchestrator | 2025-09-27 02:47:13 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:47:13.564890 | orchestrator | 2025-09-27 02:47:13 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:47:13.564928 | orchestrator | 2025-09-27 02:47:13 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:47:16.609508 | orchestrator | 2025-09-27 02:47:16 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:47:16.611106 | orchestrator | 2025-09-27 02:47:16 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:47:16.611181 | orchestrator | 2025-09-27 02:47:16 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:47:19.674303 | orchestrator | 2025-09-27 02:47:19 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:47:19.676403 | orchestrator | 2025-09-27 02:47:19 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:47:19.676490 | orchestrator | 2025-09-27 02:47:19 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:47:22.725757 | orchestrator | 2025-09-27 02:47:22 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:47:22.726937 | orchestrator | 2025-09-27 02:47:22 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:47:22.726953 | orchestrator | 2025-09-27 02:47:22 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:47:25.781868 | orchestrator | 2025-09-27 02:47:25 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:47:25.782773 | orchestrator | 2025-09-27 02:47:25 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:47:25.782841 | orchestrator | 2025-09-27 02:47:25 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:47:28.829482 | orchestrator | 2025-09-27 02:47:28 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:47:28.832206 | orchestrator | 2025-09-27 02:47:28 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:47:28.832241 | orchestrator | 2025-09-27 02:47:28 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:47:31.880313 | orchestrator | 2025-09-27 02:47:31 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:47:31.881757 | orchestrator | 2025-09-27 02:47:31 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:47:31.882139 | orchestrator | 2025-09-27 02:47:31 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:47:34.933704 | orchestrator | 2025-09-27 02:47:34 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:47:34.934241 | orchestrator | 2025-09-27 02:47:34 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state 
STARTED 2025-09-27 02:47:34.934276 | orchestrator | 2025-09-27 02:47:34 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:47:37.973605 | orchestrator | 2025-09-27 02:47:37 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:47:37.974648 | orchestrator | 2025-09-27 02:47:37 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:47:37.974699 | orchestrator | 2025-09-27 02:47:37 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:47:41.021854 | orchestrator | 2025-09-27 02:47:41 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:47:41.023093 | orchestrator | 2025-09-27 02:47:41 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:47:41.023153 | orchestrator | 2025-09-27 02:47:41 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:47:44.072681 | orchestrator | 2025-09-27 02:47:44 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:47:44.074398 | orchestrator | 2025-09-27 02:47:44 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:47:44.074494 | orchestrator | 2025-09-27 02:47:44 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:47:47.120958 | orchestrator | 2025-09-27 02:47:47 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:47:47.122504 | orchestrator | 2025-09-27 02:47:47 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:47:47.122540 | orchestrator | 2025-09-27 02:47:47 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:47:50.179879 | orchestrator | 2025-09-27 02:47:50 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:47:50.180928 | orchestrator | 2025-09-27 02:47:50 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:47:50.180956 | orchestrator | 2025-09-27 02:47:50 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:47:53.224779 | orchestrator | 2025-09-27 02:47:53 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:47:53.226531 | orchestrator | 2025-09-27 02:47:53 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:47:53.226565 | orchestrator | 2025-09-27 02:47:53 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:47:56.268220 | orchestrator | 2025-09-27 02:47:56 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:47:56.270558 | orchestrator | 2025-09-27 02:47:56 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:47:56.270594 | orchestrator | 2025-09-27 02:47:56 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:47:59.319442 | orchestrator | 2025-09-27 02:47:59 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:47:59.321578 | orchestrator | 2025-09-27 02:47:59 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:47:59.321694 | orchestrator | 2025-09-27 02:47:59 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:48:02.368849 | orchestrator | 2025-09-27 02:48:02 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:48:02.371100 | orchestrator | 2025-09-27 02:48:02 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:48:02.371217 | orchestrator | 2025-09-27 02:48:02 | INFO  | Wait 1 second(s) 
until the next check 2025-09-27 02:48:05.418938 | orchestrator | 2025-09-27 02:48:05 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:48:05.420264 | orchestrator | 2025-09-27 02:48:05 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:48:05.420292 | orchestrator | 2025-09-27 02:48:05 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:48:08.463671 | orchestrator | 2025-09-27 02:48:08 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:48:08.465345 | orchestrator | 2025-09-27 02:48:08 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:48:08.465375 | orchestrator | 2025-09-27 02:48:08 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:48:11.511382 | orchestrator | 2025-09-27 02:48:11 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:48:11.512584 | orchestrator | 2025-09-27 02:48:11 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:48:11.512865 | orchestrator | 2025-09-27 02:48:11 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:48:14.559572 | orchestrator | 2025-09-27 02:48:14 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:48:14.564500 | orchestrator | 2025-09-27 02:48:14 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:48:14.564552 | orchestrator | 2025-09-27 02:48:14 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:48:17.609417 | orchestrator | 2025-09-27 02:48:17 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:48:17.610942 | orchestrator | 2025-09-27 02:48:17 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:48:17.611450 | orchestrator | 2025-09-27 02:48:17 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:48:20.661091 | orchestrator | 2025-09-27 02:48:20 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:48:20.662618 | orchestrator | 2025-09-27 02:48:20 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:48:20.662889 | orchestrator | 2025-09-27 02:48:20 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:48:23.706186 | orchestrator | 2025-09-27 02:48:23 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:48:23.706813 | orchestrator | 2025-09-27 02:48:23 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:48:23.706848 | orchestrator | 2025-09-27 02:48:23 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:48:26.753874 | orchestrator | 2025-09-27 02:48:26 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:48:26.756085 | orchestrator | 2025-09-27 02:48:26 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:48:26.756183 | orchestrator | 2025-09-27 02:48:26 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:48:29.800487 | orchestrator | 2025-09-27 02:48:29 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:48:29.802171 | orchestrator | 2025-09-27 02:48:29 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:48:29.802284 | orchestrator | 2025-09-27 02:48:29 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:48:32.847129 | orchestrator | 2025-09-27 02:48:32 | INFO  | 
Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:48:32.849276 | orchestrator | 2025-09-27 02:48:32 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:48:32.849765 | orchestrator | 2025-09-27 02:48:32 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:48:35.897926 | orchestrator | 2025-09-27 02:48:35 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:48:35.900460 | orchestrator | 2025-09-27 02:48:35 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:48:35.900539 | orchestrator | 2025-09-27 02:48:35 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:48:38.942290 | orchestrator | 2025-09-27 02:48:38 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:48:38.944523 | orchestrator | 2025-09-27 02:48:38 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:48:38.944559 | orchestrator | 2025-09-27 02:48:38 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:48:41.989150 | orchestrator | 2025-09-27 02:48:41 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:48:41.989890 | orchestrator | 2025-09-27 02:48:41 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:48:41.990152 | orchestrator | 2025-09-27 02:48:41 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:48:45.032151 | orchestrator | 2025-09-27 02:48:45 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:48:45.033380 | orchestrator | 2025-09-27 02:48:45 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:48:45.033799 | orchestrator | 2025-09-27 02:48:45 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:48:48.075215 | orchestrator | 2025-09-27 02:48:48 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:48:48.077707 | orchestrator | 2025-09-27 02:48:48 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:48:48.077738 | orchestrator | 2025-09-27 02:48:48 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:48:51.129009 | orchestrator | 2025-09-27 02:48:51 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:48:51.130917 | orchestrator | 2025-09-27 02:48:51 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:48:51.130959 | orchestrator | 2025-09-27 02:48:51 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:48:54.183312 | orchestrator | 2025-09-27 02:48:54 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:48:54.185309 | orchestrator | 2025-09-27 02:48:54 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:48:54.185719 | orchestrator | 2025-09-27 02:48:54 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:48:57.231513 | orchestrator | 2025-09-27 02:48:57 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:48:57.232773 | orchestrator | 2025-09-27 02:48:57 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:48:57.232807 | orchestrator | 2025-09-27 02:48:57 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:49:00.278920 | orchestrator | 2025-09-27 02:49:00 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:49:00.280522 | 
orchestrator | 2025-09-27 02:49:00 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:49:00.280630 | orchestrator | 2025-09-27 02:49:00 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:49:03.328805 | orchestrator | 2025-09-27 02:49:03 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:49:03.330363 | orchestrator | 2025-09-27 02:49:03 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:49:03.330404 | orchestrator | 2025-09-27 02:49:03 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:49:06.373203 | orchestrator | 2025-09-27 02:49:06 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:49:06.374407 | orchestrator | 2025-09-27 02:49:06 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:49:06.375336 | orchestrator | 2025-09-27 02:49:06 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:49:09.419655 | orchestrator | 2025-09-27 02:49:09 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:49:09.420354 | orchestrator | 2025-09-27 02:49:09 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:49:09.420384 | orchestrator | 2025-09-27 02:49:09 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:49:12.465842 | orchestrator | 2025-09-27 02:49:12 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:49:12.466573 | orchestrator | 2025-09-27 02:49:12 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:49:12.466617 | orchestrator | 2025-09-27 02:49:12 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:49:15.515361 | orchestrator | 2025-09-27 02:49:15 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:49:15.516415 | orchestrator | 2025-09-27 02:49:15 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:49:15.516795 | orchestrator | 2025-09-27 02:49:15 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:49:18.556059 | orchestrator | 2025-09-27 02:49:18 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:49:18.557572 | orchestrator | 2025-09-27 02:49:18 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:49:18.557606 | orchestrator | 2025-09-27 02:49:18 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:49:21.611913 | orchestrator | 2025-09-27 02:49:21 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:49:21.613454 | orchestrator | 2025-09-27 02:49:21 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:49:21.613485 | orchestrator | 2025-09-27 02:49:21 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:49:24.658292 | orchestrator | 2025-09-27 02:49:24 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:49:24.660649 | orchestrator | 2025-09-27 02:49:24 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:49:24.660727 | orchestrator | 2025-09-27 02:49:24 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:49:27.705838 | orchestrator | 2025-09-27 02:49:27 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:49:27.706870 | orchestrator | 2025-09-27 02:49:27 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state 
STARTED 2025-09-27 02:49:27.706905 | orchestrator | 2025-09-27 02:49:27 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:49:30.752938 | orchestrator | 2025-09-27 02:49:30 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:49:30.756930 | orchestrator | 2025-09-27 02:49:30 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:49:30.757010 | orchestrator | 2025-09-27 02:49:30 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:49:33.802182 | orchestrator | 2025-09-27 02:49:33 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:49:33.804079 | orchestrator | 2025-09-27 02:49:33 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:49:33.804111 | orchestrator | 2025-09-27 02:49:33 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:49:36.851232 | orchestrator | 2025-09-27 02:49:36 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:49:36.853068 | orchestrator | 2025-09-27 02:49:36 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:49:36.853103 | orchestrator | 2025-09-27 02:49:36 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:49:39.902806 | orchestrator | 2025-09-27 02:49:39 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:49:39.904911 | orchestrator | 2025-09-27 02:49:39 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:49:39.904944 | orchestrator | 2025-09-27 02:49:39 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:49:42.948518 | orchestrator | 2025-09-27 02:49:42 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:49:42.950074 | orchestrator | 2025-09-27 02:49:42 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:49:42.950111 | orchestrator | 2025-09-27 02:49:42 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:49:45.997240 | orchestrator | 2025-09-27 02:49:45 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:49:45.998376 | orchestrator | 2025-09-27 02:49:45 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:49:45.998728 | orchestrator | 2025-09-27 02:49:45 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:49:49.045899 | orchestrator | 2025-09-27 02:49:49 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:49:49.047196 | orchestrator | 2025-09-27 02:49:49 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:49:49.047225 | orchestrator | 2025-09-27 02:49:49 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:49:52.099395 | orchestrator | 2025-09-27 02:49:52 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:49:52.101038 | orchestrator | 2025-09-27 02:49:52 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:49:52.101285 | orchestrator | 2025-09-27 02:49:52 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:49:55.141906 | orchestrator | 2025-09-27 02:49:55 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:49:55.143999 | orchestrator | 2025-09-27 02:49:55 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:49:55.144042 | orchestrator | 2025-09-27 02:49:55 | INFO  | Wait 1 second(s) 
until the next check 2025-09-27 02:49:58.189345 | orchestrator | 2025-09-27 02:49:58 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:49:58.191374 | orchestrator | 2025-09-27 02:49:58 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:49:58.191418 | orchestrator | 2025-09-27 02:49:58 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:50:01.237232 | orchestrator | 2025-09-27 02:50:01 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:50:01.238973 | orchestrator | 2025-09-27 02:50:01 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:50:01.239097 | orchestrator | 2025-09-27 02:50:01 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:50:04.286617 | orchestrator | 2025-09-27 02:50:04 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:50:04.288251 | orchestrator | 2025-09-27 02:50:04 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:50:04.288338 | orchestrator | 2025-09-27 02:50:04 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:50:07.331975 | orchestrator | 2025-09-27 02:50:07 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:50:07.333172 | orchestrator | 2025-09-27 02:50:07 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:50:07.333245 | orchestrator | 2025-09-27 02:50:07 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:50:10.376764 | orchestrator | 2025-09-27 02:50:10 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:50:10.378223 | orchestrator | 2025-09-27 02:50:10 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:50:10.378269 | orchestrator | 2025-09-27 02:50:10 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:50:13.421438 | orchestrator | 2025-09-27 02:50:13 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:50:13.423923 | orchestrator | 2025-09-27 02:50:13 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:50:13.424009 | orchestrator | 2025-09-27 02:50:13 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:50:16.465395 | orchestrator | 2025-09-27 02:50:16 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:50:16.466966 | orchestrator | 2025-09-27 02:50:16 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:50:16.467001 | orchestrator | 2025-09-27 02:50:16 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:50:19.518143 | orchestrator | 2025-09-27 02:50:19 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:50:19.519931 | orchestrator | 2025-09-27 02:50:19 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:50:19.519964 | orchestrator | 2025-09-27 02:50:19 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:50:22.565734 | orchestrator | 2025-09-27 02:50:22 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:50:22.567571 | orchestrator | 2025-09-27 02:50:22 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:50:22.567755 | orchestrator | 2025-09-27 02:50:22 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:50:25.614347 | orchestrator | 2025-09-27 02:50:25 | INFO  | 
Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:50:25.617917 | orchestrator | 2025-09-27 02:50:25 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:50:25.617995 | orchestrator | 2025-09-27 02:50:25 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:50:28.667977 | orchestrator | 2025-09-27 02:50:28 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:50:28.669441 | orchestrator | 2025-09-27 02:50:28 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:50:28.669490 | orchestrator | 2025-09-27 02:50:28 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:50:31.715507 | orchestrator | 2025-09-27 02:50:31 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:50:31.716990 | orchestrator | 2025-09-27 02:50:31 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:50:31.717109 | orchestrator | 2025-09-27 02:50:31 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:50:34.762539 | orchestrator | 2025-09-27 02:50:34 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:50:34.763960 | orchestrator | 2025-09-27 02:50:34 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:50:34.764209 | orchestrator | 2025-09-27 02:50:34 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:50:37.807718 | orchestrator | 2025-09-27 02:50:37 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:50:37.809801 | orchestrator | 2025-09-27 02:50:37 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:50:37.809833 | orchestrator | 2025-09-27 02:50:37 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:50:40.856902 | orchestrator | 2025-09-27 02:50:40 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:50:40.858358 | orchestrator | 2025-09-27 02:50:40 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:50:40.858395 | orchestrator | 2025-09-27 02:50:40 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:50:43.905176 | orchestrator | 2025-09-27 02:50:43 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:50:43.906307 | orchestrator | 2025-09-27 02:50:43 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:50:43.906350 | orchestrator | 2025-09-27 02:50:43 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:50:46.953566 | orchestrator | 2025-09-27 02:50:46 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:50:46.955432 | orchestrator | 2025-09-27 02:50:46 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:50:46.955521 | orchestrator | 2025-09-27 02:50:46 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:50:50.002948 | orchestrator | 2025-09-27 02:50:50 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:50:50.004237 | orchestrator | 2025-09-27 02:50:50 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:50:50.004321 | orchestrator | 2025-09-27 02:50:50 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:50:53.052815 | orchestrator | 2025-09-27 02:50:53 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:50:53.054186 | 
orchestrator | 2025-09-27 02:50:53 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:50:53.054506 | orchestrator | 2025-09-27 02:50:53 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:50:56.098750 | orchestrator | 2025-09-27 02:50:56 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:50:56.100208 | orchestrator | 2025-09-27 02:50:56 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:50:56.100314 | orchestrator | 2025-09-27 02:50:56 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:50:59.149930 | orchestrator | 2025-09-27 02:50:59 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:50:59.151688 | orchestrator | 2025-09-27 02:50:59 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:50:59.151721 | orchestrator | 2025-09-27 02:50:59 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:51:02.200125 | orchestrator | 2025-09-27 02:51:02 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:51:02.201736 | orchestrator | 2025-09-27 02:51:02 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:51:02.201886 | orchestrator | 2025-09-27 02:51:02 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:51:05.246271 | orchestrator | 2025-09-27 02:51:05 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:51:05.248915 | orchestrator | 2025-09-27 02:51:05 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:51:05.249195 | orchestrator | 2025-09-27 02:51:05 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:51:08.293519 | orchestrator | 2025-09-27 02:51:08 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:51:08.295479 | orchestrator | 2025-09-27 02:51:08 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:51:08.295519 | orchestrator | 2025-09-27 02:51:08 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:51:11.339499 | orchestrator | 2025-09-27 02:51:11 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:51:11.342826 | orchestrator | 2025-09-27 02:51:11 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:51:11.342941 | orchestrator | 2025-09-27 02:51:11 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:51:14.392138 | orchestrator | 2025-09-27 02:51:14 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:51:14.393450 | orchestrator | 2025-09-27 02:51:14 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:51:14.393478 | orchestrator | 2025-09-27 02:51:14 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:51:17.438454 | orchestrator | 2025-09-27 02:51:17 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:51:17.440341 | orchestrator | 2025-09-27 02:51:17 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:51:17.440395 | orchestrator | 2025-09-27 02:51:17 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:51:20.486717 | orchestrator | 2025-09-27 02:51:20 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:51:20.487969 | orchestrator | 2025-09-27 02:51:20 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state 
STARTED 2025-09-27 02:51:20.488005 | orchestrator | 2025-09-27 02:51:20 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:51:23.532447 | orchestrator | 2025-09-27 02:51:23 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:51:23.534470 | orchestrator | 2025-09-27 02:51:23 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:51:23.534504 | orchestrator | 2025-09-27 02:51:23 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:51:26.577722 | orchestrator | 2025-09-27 02:51:26 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:51:26.578146 | orchestrator | 2025-09-27 02:51:26 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:51:26.578180 | orchestrator | 2025-09-27 02:51:26 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:51:29.621709 | orchestrator | 2025-09-27 02:51:29 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:51:29.624561 | orchestrator | 2025-09-27 02:51:29 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:51:29.624688 | orchestrator | 2025-09-27 02:51:29 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:51:32.667563 | orchestrator | 2025-09-27 02:51:32 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:51:32.669023 | orchestrator | 2025-09-27 02:51:32 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:51:32.669058 | orchestrator | 2025-09-27 02:51:32 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:51:35.716966 | orchestrator | 2025-09-27 02:51:35 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:51:35.719146 | orchestrator | 2025-09-27 02:51:35 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:51:35.719237 | orchestrator | 2025-09-27 02:51:35 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:51:38.762189 | orchestrator | 2025-09-27 02:51:38 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:51:38.764812 | orchestrator | 2025-09-27 02:51:38 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:51:38.764846 | orchestrator | 2025-09-27 02:51:38 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:51:41.815001 | orchestrator | 2025-09-27 02:51:41 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:51:41.817046 | orchestrator | 2025-09-27 02:51:41 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:51:41.817078 | orchestrator | 2025-09-27 02:51:41 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:51:44.861679 | orchestrator | 2025-09-27 02:51:44 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:51:44.862572 | orchestrator | 2025-09-27 02:51:44 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:51:44.862805 | orchestrator | 2025-09-27 02:51:44 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:51:47.910641 | orchestrator | 2025-09-27 02:51:47 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:51:47.912578 | orchestrator | 2025-09-27 02:51:47 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:51:47.913015 | orchestrator | 2025-09-27 02:51:47 | INFO  | Wait 1 second(s) 
until the next check 2025-09-27 02:51:50.954797 | orchestrator | 2025-09-27 02:51:50 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:51:50.956121 | orchestrator | 2025-09-27 02:51:50 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:51:50.956153 | orchestrator | 2025-09-27 02:51:50 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:51:54.009300 | orchestrator | 2025-09-27 02:51:54 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:51:54.010165 | orchestrator | 2025-09-27 02:51:54 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:51:54.010663 | orchestrator | 2025-09-27 02:51:54 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:51:57.055395 | orchestrator | 2025-09-27 02:51:57 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:51:57.057874 | orchestrator | 2025-09-27 02:51:57 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:51:57.058402 | orchestrator | 2025-09-27 02:51:57 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:52:00.107241 | orchestrator | 2025-09-27 02:52:00 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:52:00.108765 | orchestrator | 2025-09-27 02:52:00 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:52:00.108952 | orchestrator | 2025-09-27 02:52:00 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:52:03.155243 | orchestrator | 2025-09-27 02:52:03 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:52:03.155957 | orchestrator | 2025-09-27 02:52:03 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:52:03.155991 | orchestrator | 2025-09-27 02:52:03 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:52:06.202798 | orchestrator | 2025-09-27 02:52:06 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:52:06.204163 | orchestrator | 2025-09-27 02:52:06 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:52:06.204238 | orchestrator | 2025-09-27 02:52:06 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:52:09.246998 | orchestrator | 2025-09-27 02:52:09 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:52:09.248694 | orchestrator | 2025-09-27 02:52:09 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:52:09.248719 | orchestrator | 2025-09-27 02:52:09 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:52:12.296533 | orchestrator | 2025-09-27 02:52:12 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:52:12.297840 | orchestrator | 2025-09-27 02:52:12 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:52:12.298124 | orchestrator | 2025-09-27 02:52:12 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:52:15.342316 | orchestrator | 2025-09-27 02:52:15 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:52:15.342768 | orchestrator | 2025-09-27 02:52:15 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:52:15.342806 | orchestrator | 2025-09-27 02:52:15 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:52:18.387368 | orchestrator | 2025-09-27 02:52:18 | INFO  | 
Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:52:18.388423 | orchestrator | 2025-09-27 02:52:18 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:52:18.389003 | orchestrator | 2025-09-27 02:52:18 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:52:21.430337 | orchestrator | 2025-09-27 02:52:21 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:52:21.432922 | orchestrator | 2025-09-27 02:52:21 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:52:21.432954 | orchestrator | 2025-09-27 02:52:21 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:52:24.481839 | orchestrator | 2025-09-27 02:52:24 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:52:24.483445 | orchestrator | 2025-09-27 02:52:24 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:52:24.483479 | orchestrator | 2025-09-27 02:52:24 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:52:27.532547 | orchestrator | 2025-09-27 02:52:27 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:52:27.535366 | orchestrator | 2025-09-27 02:52:27 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:52:27.535433 | orchestrator | 2025-09-27 02:52:27 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:52:30.582586 | orchestrator | 2025-09-27 02:52:30 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:52:30.584163 | orchestrator | 2025-09-27 02:52:30 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:52:30.584206 | orchestrator | 2025-09-27 02:52:30 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:52:33.624362 | orchestrator | 2025-09-27 02:52:33 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:52:33.625448 | orchestrator | 2025-09-27 02:52:33 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:52:33.625482 | orchestrator | 2025-09-27 02:52:33 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:52:36.671347 | orchestrator | 2025-09-27 02:52:36 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:52:36.672959 | orchestrator | 2025-09-27 02:52:36 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:52:36.673075 | orchestrator | 2025-09-27 02:52:36 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:52:39.722982 | orchestrator | 2025-09-27 02:52:39 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:52:39.725152 | orchestrator | 2025-09-27 02:52:39 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:52:39.725184 | orchestrator | 2025-09-27 02:52:39 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:52:42.770643 | orchestrator | 2025-09-27 02:52:42 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:52:42.771940 | orchestrator | 2025-09-27 02:52:42 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 02:52:42.771973 | orchestrator | 2025-09-27 02:52:42 | INFO  | Wait 1 second(s) until the next check 2025-09-27 02:52:45.819170 | orchestrator | 2025-09-27 02:52:45 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 02:52:45.820761 | 
orchestrator | 2025-09-27 02:52:45 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED
2025-09-27 02:52:45.820798 | orchestrator | 2025-09-27 02:52:45 | INFO  | Wait 1 second(s) until the next check
2025-09-27 02:52:48.867228 | orchestrator | 2025-09-27 02:52:48 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED
2025-09-27 02:52:48.868378 | orchestrator | 2025-09-27 02:52:48 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED
2025-09-27 02:52:48.868575 | orchestrator | 2025-09-27 02:52:48 | INFO  | Wait 1 second(s) until the next check
[... the same three-line polling output repeats every ~3 seconds from 02:52:51 through 03:02:06: tasks c8c195a8-0572-4728-82e9-0d11795e0ba9 and 6080a85d-265e-44df-8fd4-200b92feb3b5 remain in state STARTED, each poll followed by "Wait 1 second(s) until the next check" ...]
2025-09-27 03:02:09.653358 | orchestrator | 2025-09-27 03:02:09 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED
2025-09-27 03:02:09.654266 |
orchestrator | 2025-09-27 03:02:09 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:02:09.654303 | orchestrator | 2025-09-27 03:02:09 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:02:12.704903 | orchestrator | 2025-09-27 03:02:12 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:02:12.706320 | orchestrator | 2025-09-27 03:02:12 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:02:12.706349 | orchestrator | 2025-09-27 03:02:12 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:02:15.755674 | orchestrator | 2025-09-27 03:02:15 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:02:15.757654 | orchestrator | 2025-09-27 03:02:15 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:02:15.757685 | orchestrator | 2025-09-27 03:02:15 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:02:18.804345 | orchestrator | 2025-09-27 03:02:18 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:02:18.806114 | orchestrator | 2025-09-27 03:02:18 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:02:18.806159 | orchestrator | 2025-09-27 03:02:18 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:02:21.851296 | orchestrator | 2025-09-27 03:02:21 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:02:21.852237 | orchestrator | 2025-09-27 03:02:21 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:02:21.852311 | orchestrator | 2025-09-27 03:02:21 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:02:24.904456 | orchestrator | 2025-09-27 03:02:24 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:02:24.907205 | orchestrator | 2025-09-27 03:02:24 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:02:24.907223 | orchestrator | 2025-09-27 03:02:24 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:02:27.944123 | orchestrator | 2025-09-27 03:02:27 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:02:27.945792 | orchestrator | 2025-09-27 03:02:27 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:02:27.945837 | orchestrator | 2025-09-27 03:02:27 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:02:30.991286 | orchestrator | 2025-09-27 03:02:30 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:02:30.993020 | orchestrator | 2025-09-27 03:02:30 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:02:30.993067 | orchestrator | 2025-09-27 03:02:30 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:02:34.041129 | orchestrator | 2025-09-27 03:02:34 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:02:34.042403 | orchestrator | 2025-09-27 03:02:34 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:02:34.042436 | orchestrator | 2025-09-27 03:02:34 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:02:37.089118 | orchestrator | 2025-09-27 03:02:37 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:02:37.089409 | orchestrator | 2025-09-27 03:02:37 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state 
STARTED 2025-09-27 03:02:37.089510 | orchestrator | 2025-09-27 03:02:37 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:02:40.129265 | orchestrator | 2025-09-27 03:02:40 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:02:40.130489 | orchestrator | 2025-09-27 03:02:40 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:02:40.130525 | orchestrator | 2025-09-27 03:02:40 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:02:43.179832 | orchestrator | 2025-09-27 03:02:43 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:02:43.181371 | orchestrator | 2025-09-27 03:02:43 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:02:43.181400 | orchestrator | 2025-09-27 03:02:43 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:02:46.230786 | orchestrator | 2025-09-27 03:02:46 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:02:46.231962 | orchestrator | 2025-09-27 03:02:46 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:02:46.231989 | orchestrator | 2025-09-27 03:02:46 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:02:49.283261 | orchestrator | 2025-09-27 03:02:49 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:02:49.286330 | orchestrator | 2025-09-27 03:02:49 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:02:49.286496 | orchestrator | 2025-09-27 03:02:49 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:02:52.329344 | orchestrator | 2025-09-27 03:02:52 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:02:52.330302 | orchestrator | 2025-09-27 03:02:52 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:02:52.330339 | orchestrator | 2025-09-27 03:02:52 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:02:55.373859 | orchestrator | 2025-09-27 03:02:55 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:02:55.374966 | orchestrator | 2025-09-27 03:02:55 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:02:55.374995 | orchestrator | 2025-09-27 03:02:55 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:02:58.418126 | orchestrator | 2025-09-27 03:02:58 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:02:58.418775 | orchestrator | 2025-09-27 03:02:58 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:02:58.418823 | orchestrator | 2025-09-27 03:02:58 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:03:01.462661 | orchestrator | 2025-09-27 03:03:01 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:03:01.462770 | orchestrator | 2025-09-27 03:03:01 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:03:01.463001 | orchestrator | 2025-09-27 03:03:01 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:03:04.510650 | orchestrator | 2025-09-27 03:03:04 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:03:04.510774 | orchestrator | 2025-09-27 03:03:04 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:03:04.510788 | orchestrator | 2025-09-27 03:03:04 | INFO  | Wait 1 second(s) 
until the next check 2025-09-27 03:03:07.554968 | orchestrator | 2025-09-27 03:03:07 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:03:07.555697 | orchestrator | 2025-09-27 03:03:07 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:03:07.556523 | orchestrator | 2025-09-27 03:03:07 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:03:10.608436 | orchestrator | 2025-09-27 03:03:10 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:03:10.608572 | orchestrator | 2025-09-27 03:03:10 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:03:10.608590 | orchestrator | 2025-09-27 03:03:10 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:03:13.650256 | orchestrator | 2025-09-27 03:03:13 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:03:13.650866 | orchestrator | 2025-09-27 03:03:13 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:03:13.651183 | orchestrator | 2025-09-27 03:03:13 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:03:16.697926 | orchestrator | 2025-09-27 03:03:16 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:03:16.699743 | orchestrator | 2025-09-27 03:03:16 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:03:16.699859 | orchestrator | 2025-09-27 03:03:16 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:03:19.743276 | orchestrator | 2025-09-27 03:03:19 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:03:19.744457 | orchestrator | 2025-09-27 03:03:19 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:03:19.744581 | orchestrator | 2025-09-27 03:03:19 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:03:22.792205 | orchestrator | 2025-09-27 03:03:22 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:03:22.794789 | orchestrator | 2025-09-27 03:03:22 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:03:22.794916 | orchestrator | 2025-09-27 03:03:22 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:03:25.838866 | orchestrator | 2025-09-27 03:03:25 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:03:25.839864 | orchestrator | 2025-09-27 03:03:25 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:03:25.839895 | orchestrator | 2025-09-27 03:03:25 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:03:28.884156 | orchestrator | 2025-09-27 03:03:28 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:03:28.885254 | orchestrator | 2025-09-27 03:03:28 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:03:28.885301 | orchestrator | 2025-09-27 03:03:28 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:03:31.928659 | orchestrator | 2025-09-27 03:03:31 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:03:31.929779 | orchestrator | 2025-09-27 03:03:31 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:03:31.929813 | orchestrator | 2025-09-27 03:03:31 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:03:34.974804 | orchestrator | 2025-09-27 03:03:34 | INFO  | 
Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:03:34.975385 | orchestrator | 2025-09-27 03:03:34 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:03:34.975415 | orchestrator | 2025-09-27 03:03:34 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:03:38.020787 | orchestrator | 2025-09-27 03:03:38 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:03:38.021567 | orchestrator | 2025-09-27 03:03:38 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:03:38.021597 | orchestrator | 2025-09-27 03:03:38 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:03:41.065697 | orchestrator | 2025-09-27 03:03:41 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:03:41.071078 | orchestrator | 2025-09-27 03:03:41 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:03:41.071123 | orchestrator | 2025-09-27 03:03:41 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:03:44.109265 | orchestrator | 2025-09-27 03:03:44 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:03:44.109892 | orchestrator | 2025-09-27 03:03:44 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:03:44.109924 | orchestrator | 2025-09-27 03:03:44 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:03:47.158451 | orchestrator | 2025-09-27 03:03:47 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:03:47.161529 | orchestrator | 2025-09-27 03:03:47 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:03:47.161703 | orchestrator | 2025-09-27 03:03:47 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:03:50.206325 | orchestrator | 2025-09-27 03:03:50 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:03:50.207961 | orchestrator | 2025-09-27 03:03:50 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:03:50.208304 | orchestrator | 2025-09-27 03:03:50 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:03:53.252733 | orchestrator | 2025-09-27 03:03:53 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:03:53.254748 | orchestrator | 2025-09-27 03:03:53 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:03:53.254781 | orchestrator | 2025-09-27 03:03:53 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:03:56.299057 | orchestrator | 2025-09-27 03:03:56 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:03:56.300595 | orchestrator | 2025-09-27 03:03:56 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:03:56.300807 | orchestrator | 2025-09-27 03:03:56 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:03:59.340806 | orchestrator | 2025-09-27 03:03:59 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:03:59.341997 | orchestrator | 2025-09-27 03:03:59 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:03:59.342334 | orchestrator | 2025-09-27 03:03:59 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:04:02.389035 | orchestrator | 2025-09-27 03:04:02 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:04:02.390524 | 
orchestrator | 2025-09-27 03:04:02 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:04:02.390691 | orchestrator | 2025-09-27 03:04:02 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:04:05.440176 | orchestrator | 2025-09-27 03:04:05 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:04:05.440302 | orchestrator | 2025-09-27 03:04:05 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:04:05.440317 | orchestrator | 2025-09-27 03:04:05 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:04:08.495196 | orchestrator | 2025-09-27 03:04:08 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:04:08.496706 | orchestrator | 2025-09-27 03:04:08 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:04:08.496726 | orchestrator | 2025-09-27 03:04:08 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:04:11.532695 | orchestrator | 2025-09-27 03:04:11 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:04:11.533827 | orchestrator | 2025-09-27 03:04:11 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:04:11.533858 | orchestrator | 2025-09-27 03:04:11 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:04:14.581773 | orchestrator | 2025-09-27 03:04:14 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:04:14.583404 | orchestrator | 2025-09-27 03:04:14 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:04:14.583448 | orchestrator | 2025-09-27 03:04:14 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:04:17.625930 | orchestrator | 2025-09-27 03:04:17 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:04:17.627602 | orchestrator | 2025-09-27 03:04:17 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:04:17.627658 | orchestrator | 2025-09-27 03:04:17 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:04:20.668982 | orchestrator | 2025-09-27 03:04:20 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:04:20.669650 | orchestrator | 2025-09-27 03:04:20 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:04:20.669745 | orchestrator | 2025-09-27 03:04:20 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:04:23.714420 | orchestrator | 2025-09-27 03:04:23 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:04:23.716107 | orchestrator | 2025-09-27 03:04:23 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:04:23.716137 | orchestrator | 2025-09-27 03:04:23 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:04:26.754948 | orchestrator | 2025-09-27 03:04:26 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:04:26.757126 | orchestrator | 2025-09-27 03:04:26 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:04:26.757449 | orchestrator | 2025-09-27 03:04:26 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:04:29.800604 | orchestrator | 2025-09-27 03:04:29 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:04:29.801915 | orchestrator | 2025-09-27 03:04:29 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state 
STARTED 2025-09-27 03:04:29.801941 | orchestrator | 2025-09-27 03:04:29 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:04:32.849797 | orchestrator | 2025-09-27 03:04:32 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:04:32.849891 | orchestrator | 2025-09-27 03:04:32 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:04:32.849903 | orchestrator | 2025-09-27 03:04:32 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:04:35.902334 | orchestrator | 2025-09-27 03:04:35 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:04:35.903443 | orchestrator | 2025-09-27 03:04:35 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:04:35.903517 | orchestrator | 2025-09-27 03:04:35 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:04:38.949426 | orchestrator | 2025-09-27 03:04:38 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:04:38.951762 | orchestrator | 2025-09-27 03:04:38 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:04:38.951795 | orchestrator | 2025-09-27 03:04:38 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:04:41.994325 | orchestrator | 2025-09-27 03:04:41 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:04:41.994799 | orchestrator | 2025-09-27 03:04:41 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:04:41.994829 | orchestrator | 2025-09-27 03:04:41 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:04:45.033452 | orchestrator | 2025-09-27 03:04:45 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:04:45.034459 | orchestrator | 2025-09-27 03:04:45 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:04:45.034493 | orchestrator | 2025-09-27 03:04:45 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:04:48.081836 | orchestrator | 2025-09-27 03:04:48 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:04:48.083624 | orchestrator | 2025-09-27 03:04:48 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:04:48.083658 | orchestrator | 2025-09-27 03:04:48 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:04:51.133520 | orchestrator | 2025-09-27 03:04:51 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:04:51.135507 | orchestrator | 2025-09-27 03:04:51 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:04:51.135572 | orchestrator | 2025-09-27 03:04:51 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:04:54.178914 | orchestrator | 2025-09-27 03:04:54 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:04:54.180081 | orchestrator | 2025-09-27 03:04:54 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:04:54.180132 | orchestrator | 2025-09-27 03:04:54 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:04:57.223359 | orchestrator | 2025-09-27 03:04:57 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:04:57.224188 | orchestrator | 2025-09-27 03:04:57 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:04:57.224557 | orchestrator | 2025-09-27 03:04:57 | INFO  | Wait 1 second(s) 
until the next check 2025-09-27 03:05:00.272245 | orchestrator | 2025-09-27 03:05:00 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:05:00.272987 | orchestrator | 2025-09-27 03:05:00 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:05:00.273021 | orchestrator | 2025-09-27 03:05:00 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:05:03.318284 | orchestrator | 2025-09-27 03:05:03 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:05:03.319313 | orchestrator | 2025-09-27 03:05:03 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:05:03.319452 | orchestrator | 2025-09-27 03:05:03 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:05:06.368072 | orchestrator | 2025-09-27 03:05:06 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:05:06.369046 | orchestrator | 2025-09-27 03:05:06 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:05:06.369168 | orchestrator | 2025-09-27 03:05:06 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:05:09.414783 | orchestrator | 2025-09-27 03:05:09 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:05:09.416250 | orchestrator | 2025-09-27 03:05:09 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:05:09.416284 | orchestrator | 2025-09-27 03:05:09 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:05:12.461682 | orchestrator | 2025-09-27 03:05:12 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:05:12.463882 | orchestrator | 2025-09-27 03:05:12 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:05:12.463933 | orchestrator | 2025-09-27 03:05:12 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:05:15.507986 | orchestrator | 2025-09-27 03:05:15 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:05:15.509391 | orchestrator | 2025-09-27 03:05:15 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:05:15.509454 | orchestrator | 2025-09-27 03:05:15 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:05:18.558063 | orchestrator | 2025-09-27 03:05:18 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:05:18.559963 | orchestrator | 2025-09-27 03:05:18 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:05:18.559990 | orchestrator | 2025-09-27 03:05:18 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:05:21.615883 | orchestrator | 2025-09-27 03:05:21 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:05:21.617024 | orchestrator | 2025-09-27 03:05:21 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:05:21.617250 | orchestrator | 2025-09-27 03:05:21 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:05:24.665025 | orchestrator | 2025-09-27 03:05:24 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:05:24.665949 | orchestrator | 2025-09-27 03:05:24 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:05:24.666114 | orchestrator | 2025-09-27 03:05:24 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:05:27.714078 | orchestrator | 2025-09-27 03:05:27 | INFO  | 
Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:05:27.715012 | orchestrator | 2025-09-27 03:05:27 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:05:27.715045 | orchestrator | 2025-09-27 03:05:27 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:05:30.761746 | orchestrator | 2025-09-27 03:05:30 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:05:30.762087 | orchestrator | 2025-09-27 03:05:30 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:05:30.762186 | orchestrator | 2025-09-27 03:05:30 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:05:33.811402 | orchestrator | 2025-09-27 03:05:33 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:05:33.811838 | orchestrator | 2025-09-27 03:05:33 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:05:33.811871 | orchestrator | 2025-09-27 03:05:33 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:05:36.860196 | orchestrator | 2025-09-27 03:05:36 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:05:36.860677 | orchestrator | 2025-09-27 03:05:36 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:05:36.860709 | orchestrator | 2025-09-27 03:05:36 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:05:39.908896 | orchestrator | 2025-09-27 03:05:39 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:05:39.909874 | orchestrator | 2025-09-27 03:05:39 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:05:39.910130 | orchestrator | 2025-09-27 03:05:39 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:05:42.954805 | orchestrator | 2025-09-27 03:05:42 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:05:42.954978 | orchestrator | 2025-09-27 03:05:42 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:05:42.954999 | orchestrator | 2025-09-27 03:05:42 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:05:45.998095 | orchestrator | 2025-09-27 03:05:45 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:05:45.999890 | orchestrator | 2025-09-27 03:05:46 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:05:45.999922 | orchestrator | 2025-09-27 03:05:46 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:05:49.043481 | orchestrator | 2025-09-27 03:05:49 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:05:49.043737 | orchestrator | 2025-09-27 03:05:49 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:05:49.043759 | orchestrator | 2025-09-27 03:05:49 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:05:52.092893 | orchestrator | 2025-09-27 03:05:52 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:05:52.093741 | orchestrator | 2025-09-27 03:05:52 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:05:52.093834 | orchestrator | 2025-09-27 03:05:52 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:05:55.134328 | orchestrator | 2025-09-27 03:05:55 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:05:55.134797 | 
orchestrator | 2025-09-27 03:05:55 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:05:55.134830 | orchestrator | 2025-09-27 03:05:55 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:05:58.178784 | orchestrator | 2025-09-27 03:05:58 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:05:58.181051 | orchestrator | 2025-09-27 03:05:58 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:05:58.181182 | orchestrator | 2025-09-27 03:05:58 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:06:01.232414 | orchestrator | 2025-09-27 03:06:01 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:06:01.235033 | orchestrator | 2025-09-27 03:06:01 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:06:01.235125 | orchestrator | 2025-09-27 03:06:01 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:06:04.284049 | orchestrator | 2025-09-27 03:06:04 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:06:04.286949 | orchestrator | 2025-09-27 03:06:04 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:06:04.287018 | orchestrator | 2025-09-27 03:06:04 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:06:07.338643 | orchestrator | 2025-09-27 03:06:07 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:06:07.341091 | orchestrator | 2025-09-27 03:06:07 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:06:07.341126 | orchestrator | 2025-09-27 03:06:07 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:06:10.386726 | orchestrator | 2025-09-27 03:06:10 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:06:10.388307 | orchestrator | 2025-09-27 03:06:10 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:06:10.388378 | orchestrator | 2025-09-27 03:06:10 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:06:13.434381 | orchestrator | 2025-09-27 03:06:13 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:06:13.435941 | orchestrator | 2025-09-27 03:06:13 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:06:13.435976 | orchestrator | 2025-09-27 03:06:13 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:06:16.485022 | orchestrator | 2025-09-27 03:06:16 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:06:16.486762 | orchestrator | 2025-09-27 03:06:16 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:06:16.486801 | orchestrator | 2025-09-27 03:06:16 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:06:19.534415 | orchestrator | 2025-09-27 03:06:19 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:06:19.535700 | orchestrator | 2025-09-27 03:06:19 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:06:19.535736 | orchestrator | 2025-09-27 03:06:19 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:06:22.583848 | orchestrator | 2025-09-27 03:06:22 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:06:22.584106 | orchestrator | 2025-09-27 03:06:22 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state 
STARTED 2025-09-27 03:06:22.584205 | orchestrator | 2025-09-27 03:06:22 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:06:25.630227 | orchestrator | 2025-09-27 03:06:25 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:06:25.630857 | orchestrator | 2025-09-27 03:06:25 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:06:25.631172 | orchestrator | 2025-09-27 03:06:25 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:06:28.678891 | orchestrator | 2025-09-27 03:06:28 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:06:28.679603 | orchestrator | 2025-09-27 03:06:28 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:06:28.679734 | orchestrator | 2025-09-27 03:06:28 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:06:31.730362 | orchestrator | 2025-09-27 03:06:31 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:06:31.731115 | orchestrator | 2025-09-27 03:06:31 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:06:31.731383 | orchestrator | 2025-09-27 03:06:31 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:06:34.776252 | orchestrator | 2025-09-27 03:06:34 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:06:34.776799 | orchestrator | 2025-09-27 03:06:34 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:06:34.776833 | orchestrator | 2025-09-27 03:06:34 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:06:37.826114 | orchestrator | 2025-09-27 03:06:37 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:06:37.826719 | orchestrator | 2025-09-27 03:06:37 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:06:37.827005 | orchestrator | 2025-09-27 03:06:37 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:06:40.874283 | orchestrator | 2025-09-27 03:06:40 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:06:40.876605 | orchestrator | 2025-09-27 03:06:40 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:06:40.877004 | orchestrator | 2025-09-27 03:06:40 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:06:43.921755 | orchestrator | 2025-09-27 03:06:43 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:06:43.923152 | orchestrator | 2025-09-27 03:06:43 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:06:43.923275 | orchestrator | 2025-09-27 03:06:43 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:06:46.972233 | orchestrator | 2025-09-27 03:06:46 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:06:46.974661 | orchestrator | 2025-09-27 03:06:46 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:06:46.974698 | orchestrator | 2025-09-27 03:06:46 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:06:50.022170 | orchestrator | 2025-09-27 03:06:50 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:06:50.022359 | orchestrator | 2025-09-27 03:06:50 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:06:50.022381 | orchestrator | 2025-09-27 03:06:50 | INFO  | Wait 1 second(s) 
until the next check 2025-09-27 03:06:53.065671 | orchestrator | 2025-09-27 03:06:53 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:06:53.066891 | orchestrator | 2025-09-27 03:06:53 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:06:53.066929 | orchestrator | 2025-09-27 03:06:53 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:06:56.112436 | orchestrator | 2025-09-27 03:06:56 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:06:56.113912 | orchestrator | 2025-09-27 03:06:56 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:06:56.113950 | orchestrator | 2025-09-27 03:06:56 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:06:59.168913 | orchestrator | 2025-09-27 03:06:59 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:06:59.171108 | orchestrator | 2025-09-27 03:06:59 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:06:59.171193 | orchestrator | 2025-09-27 03:06:59 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:07:02.217163 | orchestrator | 2025-09-27 03:07:02 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:07:02.220217 | orchestrator | 2025-09-27 03:07:02 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:07:02.220250 | orchestrator | 2025-09-27 03:07:02 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:07:05.271619 | orchestrator | 2025-09-27 03:07:05 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:07:05.273498 | orchestrator | 2025-09-27 03:07:05 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:07:05.273536 | orchestrator | 2025-09-27 03:07:05 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:07:08.332787 | orchestrator | 2025-09-27 03:07:08 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:07:08.332962 | orchestrator | 2025-09-27 03:07:08 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:07:08.332978 | orchestrator | 2025-09-27 03:07:08 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:07:11.378984 | orchestrator | 2025-09-27 03:07:11 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:07:11.380566 | orchestrator | 2025-09-27 03:07:11 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:07:11.380746 | orchestrator | 2025-09-27 03:07:11 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:07:14.425011 | orchestrator | 2025-09-27 03:07:14 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:07:14.426578 | orchestrator | 2025-09-27 03:07:14 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:07:14.426627 | orchestrator | 2025-09-27 03:07:14 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:07:17.468583 | orchestrator | 2025-09-27 03:07:17 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:07:17.469568 | orchestrator | 2025-09-27 03:07:17 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:07:17.469600 | orchestrator | 2025-09-27 03:07:17 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:07:20.513020 | orchestrator | 2025-09-27 03:07:20 | INFO  | 
Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:07:20.513302 | orchestrator | 2025-09-27 03:07:20 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:07:20.513325 | orchestrator | 2025-09-27 03:07:20 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:07:23.558608 | orchestrator | 2025-09-27 03:07:23 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:07:23.559264 | orchestrator | 2025-09-27 03:07:23 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:07:23.559310 | orchestrator | 2025-09-27 03:07:23 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:07:26.606505 | orchestrator | 2025-09-27 03:07:26 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:07:26.607523 | orchestrator | 2025-09-27 03:07:26 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:07:26.607557 | orchestrator | 2025-09-27 03:07:26 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:07:29.651169 | orchestrator | 2025-09-27 03:07:29 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:07:29.653501 | orchestrator | 2025-09-27 03:07:29 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:07:29.653616 | orchestrator | 2025-09-27 03:07:29 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:07:32.699877 | orchestrator | 2025-09-27 03:07:32 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:07:32.700836 | orchestrator | 2025-09-27 03:07:32 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:07:32.700906 | orchestrator | 2025-09-27 03:07:32 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:07:35.746367 | orchestrator | 2025-09-27 03:07:35 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:07:35.746838 | orchestrator | 2025-09-27 03:07:35 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:07:35.746869 | orchestrator | 2025-09-27 03:07:35 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:07:38.797134 | orchestrator | 2025-09-27 03:07:38 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:07:38.797554 | orchestrator | 2025-09-27 03:07:38 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:07:38.797588 | orchestrator | 2025-09-27 03:07:38 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:07:41.844272 | orchestrator | 2025-09-27 03:07:41 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:07:41.844902 | orchestrator | 2025-09-27 03:07:41 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:07:41.844936 | orchestrator | 2025-09-27 03:07:41 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:07:44.896905 | orchestrator | 2025-09-27 03:07:44 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:07:44.898956 | orchestrator | 2025-09-27 03:07:44 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:07:44.898989 | orchestrator | 2025-09-27 03:07:44 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:07:47.947487 | orchestrator | 2025-09-27 03:07:47 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:07:47.947934 | 
orchestrator | 2025-09-27 03:07:47 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:07:47.949320 | orchestrator | 2025-09-27 03:07:47 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:07:50.995797 | orchestrator | 2025-09-27 03:07:50 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:07:50.996516 | orchestrator | 2025-09-27 03:07:50 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:07:50.996603 | orchestrator | 2025-09-27 03:07:50 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:07:54.042889 | orchestrator | 2025-09-27 03:07:54 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:07:54.045076 | orchestrator | 2025-09-27 03:07:54 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:07:54.045111 | orchestrator | 2025-09-27 03:07:54 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:07:57.090315 | orchestrator | 2025-09-27 03:07:57 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:07:57.091707 | orchestrator | 2025-09-27 03:07:57 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:07:57.091741 | orchestrator | 2025-09-27 03:07:57 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:08:00.145602 | orchestrator | 2025-09-27 03:08:00 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:08:00.145705 | orchestrator | 2025-09-27 03:08:00 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:08:00.145804 | orchestrator | 2025-09-27 03:08:00 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:08:03.193723 | orchestrator | 2025-09-27 03:08:03 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:08:03.195378 | orchestrator | 2025-09-27 03:08:03 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:08:03.195751 | orchestrator | 2025-09-27 03:08:03 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:08:06.245941 | orchestrator | 2025-09-27 03:08:06 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:08:06.246447 | orchestrator | 2025-09-27 03:08:06 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:08:06.246584 | orchestrator | 2025-09-27 03:08:06 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:08:09.296751 | orchestrator | 2025-09-27 03:08:09 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:08:09.297757 | orchestrator | 2025-09-27 03:08:09 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:08:09.298097 | orchestrator | 2025-09-27 03:08:09 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:08:12.338578 | orchestrator | 2025-09-27 03:08:12 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:08:12.339613 | orchestrator | 2025-09-27 03:08:12 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:08:12.339674 | orchestrator | 2025-09-27 03:08:12 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:08:15.383551 | orchestrator | 2025-09-27 03:08:15 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:08:15.385384 | orchestrator | 2025-09-27 03:08:15 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state 
STARTED
2025-09-27 03:08:15.385442 | orchestrator | 2025-09-27 03:08:15 | INFO  | Wait 1 second(s) until the next check
2025-09-27 03:08:18.438086 | orchestrator | 2025-09-27 03:08:18 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED
2025-09-27 03:08:18.438752 | orchestrator | 2025-09-27 03:08:18 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED
2025-09-27 03:08:18.439003 | orchestrator | 2025-09-27 03:08:18 | INFO  | Wait 1 second(s) until the next check
[... the same three INFO messages repeat roughly every 3 seconds until 2025-09-27 03:17:36; tasks c8c195a8-0572-4728-82e9-0d11795e0ba9 and 6080a85d-265e-44df-8fd4-200b92feb3b5 remain in state STARTED the entire time ...]
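The repeated entries above are a plain wait loop: the deployment keeps asking for the state of the two manager tasks and sleeps briefly between checks until they leave the STARTED state. The sketch below is a minimal, hypothetical illustration of that pattern in Python; the get_task_state callable, the default interval, and the log format are assumptions for illustration, not the osism CLI's actual implementation.

# Minimal sketch (an assumption, not the real osism code) of a poll-until-done loop.
# get_task_state is a hypothetical callable returning a state string such as "STARTED" or "SUCCESS".
import logging
import time

logging.basicConfig(format="%(asctime)s | %(levelname)s | %(message)s", level=logging.INFO)

def wait_for_tasks(get_task_state, task_ids, interval=1.0):
    """Block until every task has left the PENDING/STARTED states."""
    pending = set(task_ids)
    while pending:
        for task_id in sorted(pending):
            state = get_task_state(task_id)   # hypothetical status lookup
            logging.info("Task %s is in state %s", task_id, state)
            if state not in ("PENDING", "STARTED"):
                pending.discard(task_id)      # finished task: stop polling it
        if pending:
            logging.info("Wait %d second(s) until the next check", interval)
            time.sleep(interval)

With the task IDs from this run, a call along the lines of wait_for_tasks(get_task_state, ["c8c195a8-0572-4728-82e9-0d11795e0ba9", "6080a85d-265e-44df-8fd4-200b92feb3b5"]) would produce output of the same shape as the surrounding log.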
orchestrator | 2025-09-27 03:17:11 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:17:11.801714 | orchestrator | 2025-09-27 03:17:11 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:17:14.846246 | orchestrator | 2025-09-27 03:17:14 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:17:14.847869 | orchestrator | 2025-09-27 03:17:14 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:17:14.847904 | orchestrator | 2025-09-27 03:17:14 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:17:17.893723 | orchestrator | 2025-09-27 03:17:17 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:17:17.895215 | orchestrator | 2025-09-27 03:17:17 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:17:17.895273 | orchestrator | 2025-09-27 03:17:17 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:17:20.937695 | orchestrator | 2025-09-27 03:17:20 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:17:20.938224 | orchestrator | 2025-09-27 03:17:20 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:17:20.938255 | orchestrator | 2025-09-27 03:17:20 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:17:23.979808 | orchestrator | 2025-09-27 03:17:23 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:17:23.980979 | orchestrator | 2025-09-27 03:17:23 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:17:23.981025 | orchestrator | 2025-09-27 03:17:23 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:17:27.021576 | orchestrator | 2025-09-27 03:17:27 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:17:27.023241 | orchestrator | 2025-09-27 03:17:27 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:17:27.023418 | orchestrator | 2025-09-27 03:17:27 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:17:30.066620 | orchestrator | 2025-09-27 03:17:30 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:17:30.070258 | orchestrator | 2025-09-27 03:17:30 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:17:30.070289 | orchestrator | 2025-09-27 03:17:30 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:17:33.112092 | orchestrator | 2025-09-27 03:17:33 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:17:33.113399 | orchestrator | 2025-09-27 03:17:33 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:17:33.113510 | orchestrator | 2025-09-27 03:17:33 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:17:36.154095 | orchestrator | 2025-09-27 03:17:36 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:17:36.155353 | orchestrator | 2025-09-27 03:17:36 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:17:36.155382 | orchestrator | 2025-09-27 03:17:36 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:17:39.200902 | orchestrator | 2025-09-27 03:17:39 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:17:39.201077 | orchestrator | 2025-09-27 03:17:39 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state 
STARTED 2025-09-27 03:17:39.201097 | orchestrator | 2025-09-27 03:17:39 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:17:42.237842 | orchestrator | 2025-09-27 03:17:42 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:17:42.239156 | orchestrator | 2025-09-27 03:17:42 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:17:42.239187 | orchestrator | 2025-09-27 03:17:42 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:17:45.279533 | orchestrator | 2025-09-27 03:17:45 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:17:45.280100 | orchestrator | 2025-09-27 03:17:45 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:17:45.280165 | orchestrator | 2025-09-27 03:17:45 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:17:48.321357 | orchestrator | 2025-09-27 03:17:48 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:17:48.322232 | orchestrator | 2025-09-27 03:17:48 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:17:48.322263 | orchestrator | 2025-09-27 03:17:48 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:17:51.368193 | orchestrator | 2025-09-27 03:17:51 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:17:51.368316 | orchestrator | 2025-09-27 03:17:51 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:17:51.368330 | orchestrator | 2025-09-27 03:17:51 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:17:54.408372 | orchestrator | 2025-09-27 03:17:54 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:17:54.409445 | orchestrator | 2025-09-27 03:17:54 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:17:54.409498 | orchestrator | 2025-09-27 03:17:54 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:17:57.452392 | orchestrator | 2025-09-27 03:17:57 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:17:57.458433 | orchestrator | 2025-09-27 03:17:57 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:17:57.458979 | orchestrator | 2025-09-27 03:17:57 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:18:00.508290 | orchestrator | 2025-09-27 03:18:00 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:18:00.508512 | orchestrator | 2025-09-27 03:18:00 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:18:00.508534 | orchestrator | 2025-09-27 03:18:00 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:18:03.552884 | orchestrator | 2025-09-27 03:18:03 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:18:03.553957 | orchestrator | 2025-09-27 03:18:03 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:18:03.553987 | orchestrator | 2025-09-27 03:18:03 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:18:06.599341 | orchestrator | 2025-09-27 03:18:06 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:18:06.600150 | orchestrator | 2025-09-27 03:18:06 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:18:06.600381 | orchestrator | 2025-09-27 03:18:06 | INFO  | Wait 1 second(s) 
until the next check 2025-09-27 03:18:09.648617 | orchestrator | 2025-09-27 03:18:09 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:18:09.649218 | orchestrator | 2025-09-27 03:18:09 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:18:09.649251 | orchestrator | 2025-09-27 03:18:09 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:18:12.694673 | orchestrator | 2025-09-27 03:18:12 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:18:12.696207 | orchestrator | 2025-09-27 03:18:12 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:18:12.696249 | orchestrator | 2025-09-27 03:18:12 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:18:15.739532 | orchestrator | 2025-09-27 03:18:15 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:18:15.740976 | orchestrator | 2025-09-27 03:18:15 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:18:15.741006 | orchestrator | 2025-09-27 03:18:15 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:18:18.783022 | orchestrator | 2025-09-27 03:18:18 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:18:18.784166 | orchestrator | 2025-09-27 03:18:18 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:18:18.784195 | orchestrator | 2025-09-27 03:18:18 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:18:21.833471 | orchestrator | 2025-09-27 03:18:21 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:18:21.833877 | orchestrator | 2025-09-27 03:18:21 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:18:21.833905 | orchestrator | 2025-09-27 03:18:21 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:18:24.880169 | orchestrator | 2025-09-27 03:18:24 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:18:24.881305 | orchestrator | 2025-09-27 03:18:24 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:18:24.881711 | orchestrator | 2025-09-27 03:18:24 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:18:27.924701 | orchestrator | 2025-09-27 03:18:27 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:18:27.925668 | orchestrator | 2025-09-27 03:18:27 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:18:27.926015 | orchestrator | 2025-09-27 03:18:27 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:18:30.966327 | orchestrator | 2025-09-27 03:18:30 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:18:30.968265 | orchestrator | 2025-09-27 03:18:30 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:18:30.968294 | orchestrator | 2025-09-27 03:18:30 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:18:34.013695 | orchestrator | 2025-09-27 03:18:34 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:18:34.014552 | orchestrator | 2025-09-27 03:18:34 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:18:34.014577 | orchestrator | 2025-09-27 03:18:34 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:18:37.060808 | orchestrator | 2025-09-27 03:18:37 | INFO  | 
Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:18:37.061219 | orchestrator | 2025-09-27 03:18:37 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:18:37.061247 | orchestrator | 2025-09-27 03:18:37 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:18:40.101706 | orchestrator | 2025-09-27 03:18:40 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:18:40.103056 | orchestrator | 2025-09-27 03:18:40 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:18:40.103083 | orchestrator | 2025-09-27 03:18:40 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:18:43.142742 | orchestrator | 2025-09-27 03:18:43 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:18:43.143811 | orchestrator | 2025-09-27 03:18:43 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:18:43.144315 | orchestrator | 2025-09-27 03:18:43 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:18:46.188000 | orchestrator | 2025-09-27 03:18:46 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:18:46.189515 | orchestrator | 2025-09-27 03:18:46 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:18:46.189544 | orchestrator | 2025-09-27 03:18:46 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:18:49.242324 | orchestrator | 2025-09-27 03:18:49 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:18:49.243971 | orchestrator | 2025-09-27 03:18:49 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:18:49.244000 | orchestrator | 2025-09-27 03:18:49 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:18:52.289302 | orchestrator | 2025-09-27 03:18:52 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:18:52.290949 | orchestrator | 2025-09-27 03:18:52 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:18:52.290978 | orchestrator | 2025-09-27 03:18:52 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:18:55.341672 | orchestrator | 2025-09-27 03:18:55 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:18:55.341851 | orchestrator | 2025-09-27 03:18:55 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:18:55.341871 | orchestrator | 2025-09-27 03:18:55 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:18:58.389284 | orchestrator | 2025-09-27 03:18:58 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:18:58.389959 | orchestrator | 2025-09-27 03:18:58 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:18:58.390260 | orchestrator | 2025-09-27 03:18:58 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:19:01.434785 | orchestrator | 2025-09-27 03:19:01 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:19:01.436552 | orchestrator | 2025-09-27 03:19:01 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:19:01.436623 | orchestrator | 2025-09-27 03:19:01 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:19:04.482847 | orchestrator | 2025-09-27 03:19:04 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:19:04.484108 | 
orchestrator | 2025-09-27 03:19:04 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:19:04.484172 | orchestrator | 2025-09-27 03:19:04 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:19:07.529405 | orchestrator | 2025-09-27 03:19:07 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:19:07.529866 | orchestrator | 2025-09-27 03:19:07 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:19:07.529945 | orchestrator | 2025-09-27 03:19:07 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:19:10.589298 | orchestrator | 2025-09-27 03:19:10 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:19:10.590456 | orchestrator | 2025-09-27 03:19:10 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:19:10.590492 | orchestrator | 2025-09-27 03:19:10 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:19:13.633846 | orchestrator | 2025-09-27 03:19:13 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:19:13.635479 | orchestrator | 2025-09-27 03:19:13 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:19:13.635671 | orchestrator | 2025-09-27 03:19:13 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:19:16.689580 | orchestrator | 2025-09-27 03:19:16 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:19:16.691443 | orchestrator | 2025-09-27 03:19:16 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:19:16.691697 | orchestrator | 2025-09-27 03:19:16 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:19:19.743703 | orchestrator | 2025-09-27 03:19:19 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:19:19.745404 | orchestrator | 2025-09-27 03:19:19 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:19:19.746244 | orchestrator | 2025-09-27 03:19:19 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:19:22.798638 | orchestrator | 2025-09-27 03:19:22 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:19:22.799392 | orchestrator | 2025-09-27 03:19:22 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:19:22.799664 | orchestrator | 2025-09-27 03:19:22 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:19:25.848974 | orchestrator | 2025-09-27 03:19:25 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:19:25.851467 | orchestrator | 2025-09-27 03:19:25 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:19:25.851500 | orchestrator | 2025-09-27 03:19:25 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:19:28.914581 | orchestrator | 2025-09-27 03:19:28 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:19:28.914785 | orchestrator | 2025-09-27 03:19:28 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:19:28.915112 | orchestrator | 2025-09-27 03:19:28 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:19:31.978326 | orchestrator | 2025-09-27 03:19:31 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:19:31.979741 | orchestrator | 2025-09-27 03:19:31 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state 
STARTED 2025-09-27 03:19:31.979779 | orchestrator | 2025-09-27 03:19:31 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:19:35.037311 | orchestrator | 2025-09-27 03:19:35 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:19:35.040507 | orchestrator | 2025-09-27 03:19:35 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:19:35.040570 | orchestrator | 2025-09-27 03:19:35 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:19:38.090533 | orchestrator | 2025-09-27 03:19:38 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:19:38.093675 | orchestrator | 2025-09-27 03:19:38 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:19:38.093772 | orchestrator | 2025-09-27 03:19:38 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:19:41.144346 | orchestrator | 2025-09-27 03:19:41 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:19:41.148259 | orchestrator | 2025-09-27 03:19:41 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:19:41.148349 | orchestrator | 2025-09-27 03:19:41 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:19:44.200485 | orchestrator | 2025-09-27 03:19:44 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:19:44.200695 | orchestrator | 2025-09-27 03:19:44 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:19:44.200717 | orchestrator | 2025-09-27 03:19:44 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:19:47.251237 | orchestrator | 2025-09-27 03:19:47 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:19:47.252823 | orchestrator | 2025-09-27 03:19:47 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:19:47.252849 | orchestrator | 2025-09-27 03:19:47 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:19:50.304351 | orchestrator | 2025-09-27 03:19:50 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:19:50.304755 | orchestrator | 2025-09-27 03:19:50 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:19:50.304782 | orchestrator | 2025-09-27 03:19:50 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:19:53.348655 | orchestrator | 2025-09-27 03:19:53 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:19:53.348957 | orchestrator | 2025-09-27 03:19:53 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:19:53.348983 | orchestrator | 2025-09-27 03:19:53 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:19:56.390597 | orchestrator | 2025-09-27 03:19:56 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:19:56.391274 | orchestrator | 2025-09-27 03:19:56 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:19:56.391303 | orchestrator | 2025-09-27 03:19:56 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:19:59.440242 | orchestrator | 2025-09-27 03:19:59 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:19:59.440811 | orchestrator | 2025-09-27 03:19:59 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:19:59.440837 | orchestrator | 2025-09-27 03:19:59 | INFO  | Wait 1 second(s) 
until the next check 2025-09-27 03:20:02.491573 | orchestrator | 2025-09-27 03:20:02 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:20:02.492887 | orchestrator | 2025-09-27 03:20:02 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:20:02.492915 | orchestrator | 2025-09-27 03:20:02 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:20:05.538449 | orchestrator | 2025-09-27 03:20:05 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:20:05.538552 | orchestrator | 2025-09-27 03:20:05 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:20:05.538567 | orchestrator | 2025-09-27 03:20:05 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:20:08.587647 | orchestrator | 2025-09-27 03:20:08 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:20:08.588977 | orchestrator | 2025-09-27 03:20:08 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:20:08.589005 | orchestrator | 2025-09-27 03:20:08 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:20:11.635429 | orchestrator | 2025-09-27 03:20:11 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:20:11.636787 | orchestrator | 2025-09-27 03:20:11 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:20:11.636824 | orchestrator | 2025-09-27 03:20:11 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:20:14.684071 | orchestrator | 2025-09-27 03:20:14 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:20:14.684608 | orchestrator | 2025-09-27 03:20:14 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:20:14.684826 | orchestrator | 2025-09-27 03:20:14 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:20:17.721689 | orchestrator | 2025-09-27 03:20:17 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:20:17.723147 | orchestrator | 2025-09-27 03:20:17 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:20:17.723179 | orchestrator | 2025-09-27 03:20:17 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:20:20.771341 | orchestrator | 2025-09-27 03:20:20 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:20:20.771773 | orchestrator | 2025-09-27 03:20:20 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:20:20.772122 | orchestrator | 2025-09-27 03:20:20 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:20:23.821797 | orchestrator | 2025-09-27 03:20:23 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:20:23.824837 | orchestrator | 2025-09-27 03:20:23 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:20:23.825134 | orchestrator | 2025-09-27 03:20:23 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:20:26.868454 | orchestrator | 2025-09-27 03:20:26 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:20:26.870411 | orchestrator | 2025-09-27 03:20:26 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:20:26.871122 | orchestrator | 2025-09-27 03:20:26 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:20:29.920878 | orchestrator | 2025-09-27 03:20:29 | INFO  | 
Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:20:29.924391 | orchestrator | 2025-09-27 03:20:29 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:20:29.924434 | orchestrator | 2025-09-27 03:20:29 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:20:32.969483 | orchestrator | 2025-09-27 03:20:32 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:20:32.970417 | orchestrator | 2025-09-27 03:20:32 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:20:32.970448 | orchestrator | 2025-09-27 03:20:32 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:20:36.017146 | orchestrator | 2025-09-27 03:20:36 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:20:36.018517 | orchestrator | 2025-09-27 03:20:36 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:20:36.018545 | orchestrator | 2025-09-27 03:20:36 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:20:39.066430 | orchestrator | 2025-09-27 03:20:39 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:20:39.067312 | orchestrator | 2025-09-27 03:20:39 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:20:39.067340 | orchestrator | 2025-09-27 03:20:39 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:20:42.112878 | orchestrator | 2025-09-27 03:20:42 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:20:42.114395 | orchestrator | 2025-09-27 03:20:42 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:20:42.114425 | orchestrator | 2025-09-27 03:20:42 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:20:45.166416 | orchestrator | 2025-09-27 03:20:45 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:20:45.167544 | orchestrator | 2025-09-27 03:20:45 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:20:45.167612 | orchestrator | 2025-09-27 03:20:45 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:20:48.218157 | orchestrator | 2025-09-27 03:20:48 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:20:48.218755 | orchestrator | 2025-09-27 03:20:48 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:20:48.218837 | orchestrator | 2025-09-27 03:20:48 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:20:51.269000 | orchestrator | 2025-09-27 03:20:51 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:20:51.269972 | orchestrator | 2025-09-27 03:20:51 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:20:51.270000 | orchestrator | 2025-09-27 03:20:51 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:20:54.320701 | orchestrator | 2025-09-27 03:20:54 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:20:54.322220 | orchestrator | 2025-09-27 03:20:54 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:20:54.322292 | orchestrator | 2025-09-27 03:20:54 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:20:57.365630 | orchestrator | 2025-09-27 03:20:57 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:20:57.371177 | 
orchestrator | 2025-09-27 03:20:57 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:20:57.371224 | orchestrator | 2025-09-27 03:20:57 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:21:00.412039 | orchestrator | 2025-09-27 03:21:00 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:21:00.412908 | orchestrator | 2025-09-27 03:21:00 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:21:00.413158 | orchestrator | 2025-09-27 03:21:00 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:21:03.463535 | orchestrator | 2025-09-27 03:21:03 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:21:03.465093 | orchestrator | 2025-09-27 03:21:03 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:21:03.465127 | orchestrator | 2025-09-27 03:21:03 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:21:06.516408 | orchestrator | 2025-09-27 03:21:06 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:21:06.519017 | orchestrator | 2025-09-27 03:21:06 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:21:06.519046 | orchestrator | 2025-09-27 03:21:06 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:21:09.568022 | orchestrator | 2025-09-27 03:21:09 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:21:09.569017 | orchestrator | 2025-09-27 03:21:09 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:21:09.569093 | orchestrator | 2025-09-27 03:21:09 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:21:12.613590 | orchestrator | 2025-09-27 03:21:12 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:21:12.614807 | orchestrator | 2025-09-27 03:21:12 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:21:12.614982 | orchestrator | 2025-09-27 03:21:12 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:21:15.658224 | orchestrator | 2025-09-27 03:21:15 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:21:15.659855 | orchestrator | 2025-09-27 03:21:15 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:21:15.659890 | orchestrator | 2025-09-27 03:21:15 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:21:18.698316 | orchestrator | 2025-09-27 03:21:18 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:21:18.699595 | orchestrator | 2025-09-27 03:21:18 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:21:18.699837 | orchestrator | 2025-09-27 03:21:18 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:21:21.748373 | orchestrator | 2025-09-27 03:21:21 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:21:21.749831 | orchestrator | 2025-09-27 03:21:21 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:21:21.750444 | orchestrator | 2025-09-27 03:21:21 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:21:24.798852 | orchestrator | 2025-09-27 03:21:24 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:21:24.800434 | orchestrator | 2025-09-27 03:21:24 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state 
STARTED 2025-09-27 03:21:24.800468 | orchestrator | 2025-09-27 03:21:24 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:21:27.846638 | orchestrator | 2025-09-27 03:21:27 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:21:27.848817 | orchestrator | 2025-09-27 03:21:27 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:21:27.849022 | orchestrator | 2025-09-27 03:21:27 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:21:30.900638 | orchestrator | 2025-09-27 03:21:30 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:21:30.902277 | orchestrator | 2025-09-27 03:21:30 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:21:30.902438 | orchestrator | 2025-09-27 03:21:30 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:21:33.945665 | orchestrator | 2025-09-27 03:21:33 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:21:33.947325 | orchestrator | 2025-09-27 03:21:33 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:21:33.947410 | orchestrator | 2025-09-27 03:21:33 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:21:36.995214 | orchestrator | 2025-09-27 03:21:36 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:21:36.998269 | orchestrator | 2025-09-27 03:21:36 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:21:36.998315 | orchestrator | 2025-09-27 03:21:36 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:21:40.042806 | orchestrator | 2025-09-27 03:21:40 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:21:40.043587 | orchestrator | 2025-09-27 03:21:40 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:21:40.043724 | orchestrator | 2025-09-27 03:21:40 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:21:43.084623 | orchestrator | 2025-09-27 03:21:43 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:21:43.086195 | orchestrator | 2025-09-27 03:21:43 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:21:43.086527 | orchestrator | 2025-09-27 03:21:43 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:21:46.137590 | orchestrator | 2025-09-27 03:21:46 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:21:46.138906 | orchestrator | 2025-09-27 03:21:46 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:21:46.138922 | orchestrator | 2025-09-27 03:21:46 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:21:49.184778 | orchestrator | 2025-09-27 03:21:49 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:21:49.187240 | orchestrator | 2025-09-27 03:21:49 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:21:49.187272 | orchestrator | 2025-09-27 03:21:49 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:21:52.238346 | orchestrator | 2025-09-27 03:21:52 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:21:52.242164 | orchestrator | 2025-09-27 03:21:52 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:21:52.242238 | orchestrator | 2025-09-27 03:21:52 | INFO  | Wait 1 second(s) 
until the next check 2025-09-27 03:21:55.289704 | orchestrator | 2025-09-27 03:21:55 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:21:55.291669 | orchestrator | 2025-09-27 03:21:55 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:21:55.292043 | orchestrator | 2025-09-27 03:21:55 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:21:58.344496 | orchestrator | 2025-09-27 03:21:58 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:21:58.345810 | orchestrator | 2025-09-27 03:21:58 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:21:58.346002 | orchestrator | 2025-09-27 03:21:58 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:22:01.397842 | orchestrator | 2025-09-27 03:22:01 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:22:01.402843 | orchestrator | 2025-09-27 03:22:01 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:22:01.403247 | orchestrator | 2025-09-27 03:22:01 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:22:04.452721 | orchestrator | 2025-09-27 03:22:04 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:22:04.453982 | orchestrator | 2025-09-27 03:22:04 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:22:04.454086 | orchestrator | 2025-09-27 03:22:04 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:22:07.510159 | orchestrator | 2025-09-27 03:22:07 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:22:07.512513 | orchestrator | 2025-09-27 03:22:07 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:22:07.512544 | orchestrator | 2025-09-27 03:22:07 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:22:10.560196 | orchestrator | 2025-09-27 03:22:10 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:22:10.561314 | orchestrator | 2025-09-27 03:22:10 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:22:10.561346 | orchestrator | 2025-09-27 03:22:10 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:22:13.612652 | orchestrator | 2025-09-27 03:22:13 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:22:13.613183 | orchestrator | 2025-09-27 03:22:13 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:22:13.613267 | orchestrator | 2025-09-27 03:22:13 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:22:16.662508 | orchestrator | 2025-09-27 03:22:16 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:22:16.664215 | orchestrator | 2025-09-27 03:22:16 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:22:16.664309 | orchestrator | 2025-09-27 03:22:16 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:22:19.708353 | orchestrator | 2025-09-27 03:22:19 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:22:19.709533 | orchestrator | 2025-09-27 03:22:19 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:22:19.709566 | orchestrator | 2025-09-27 03:22:19 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:22:22.755635 | orchestrator | 2025-09-27 03:22:22 | INFO  | 
Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:22:22.757473 | orchestrator | 2025-09-27 03:22:22 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:22:22.757504 | orchestrator | 2025-09-27 03:22:22 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:22:25.805407 | orchestrator | 2025-09-27 03:22:25 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:22:25.806957 | orchestrator | 2025-09-27 03:22:25 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:22:25.806984 | orchestrator | 2025-09-27 03:22:25 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:22:28.853626 | orchestrator | 2025-09-27 03:22:28 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:22:28.854996 | orchestrator | 2025-09-27 03:22:28 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:22:28.855095 | orchestrator | 2025-09-27 03:22:28 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:22:31.902640 | orchestrator | 2025-09-27 03:22:31 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:22:31.903328 | orchestrator | 2025-09-27 03:22:31 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:22:31.903361 | orchestrator | 2025-09-27 03:22:31 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:22:34.946427 | orchestrator | 2025-09-27 03:22:34 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:22:34.947481 | orchestrator | 2025-09-27 03:22:34 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:22:34.947515 | orchestrator | 2025-09-27 03:22:34 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:22:37.993210 | orchestrator | 2025-09-27 03:22:37 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:22:37.995614 | orchestrator | 2025-09-27 03:22:37 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:22:37.995652 | orchestrator | 2025-09-27 03:22:37 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:22:41.046909 | orchestrator | 2025-09-27 03:22:41 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:22:41.048750 | orchestrator | 2025-09-27 03:22:41 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:22:41.048783 | orchestrator | 2025-09-27 03:22:41 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:22:44.092390 | orchestrator | 2025-09-27 03:22:44 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:22:44.093554 | orchestrator | 2025-09-27 03:22:44 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:22:44.093583 | orchestrator | 2025-09-27 03:22:44 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:22:47.139510 | orchestrator | 2025-09-27 03:22:47 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:22:47.141532 | orchestrator | 2025-09-27 03:22:47 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:22:47.141858 | orchestrator | 2025-09-27 03:22:47 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:22:50.193907 | orchestrator | 2025-09-27 03:22:50 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:22:50.196165 | 
orchestrator | 2025-09-27 03:22:50 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:22:50.196347 | orchestrator | 2025-09-27 03:22:50 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:22:53.248577 | orchestrator | 2025-09-27 03:22:53 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:22:53.249743 | orchestrator | 2025-09-27 03:22:53 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:22:53.249776 | orchestrator | 2025-09-27 03:22:53 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:22:56.295540 | orchestrator | 2025-09-27 03:22:56 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:22:56.297146 | orchestrator | 2025-09-27 03:22:56 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:22:56.297216 | orchestrator | 2025-09-27 03:22:56 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:22:59.341198 | orchestrator | 2025-09-27 03:22:59 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:22:59.342368 | orchestrator | 2025-09-27 03:22:59 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:22:59.342578 | orchestrator | 2025-09-27 03:22:59 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:23:02.391973 | orchestrator | 2025-09-27 03:23:02 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:23:02.394188 | orchestrator | 2025-09-27 03:23:02 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:23:02.394224 | orchestrator | 2025-09-27 03:23:02 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:23:05.437115 | orchestrator | 2025-09-27 03:23:05 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:23:05.438958 | orchestrator | 2025-09-27 03:23:05 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:23:05.439220 | orchestrator | 2025-09-27 03:23:05 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:23:08.485109 | orchestrator | 2025-09-27 03:23:08 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:23:08.486188 | orchestrator | 2025-09-27 03:23:08 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:23:08.486354 | orchestrator | 2025-09-27 03:23:08 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:23:11.530869 | orchestrator | 2025-09-27 03:23:11 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:23:11.532211 | orchestrator | 2025-09-27 03:23:11 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:23:11.532244 | orchestrator | 2025-09-27 03:23:11 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:23:14.577577 | orchestrator | 2025-09-27 03:23:14 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:23:14.579810 | orchestrator | 2025-09-27 03:23:14 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:23:14.579955 | orchestrator | 2025-09-27 03:23:14 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:23:17.624685 | orchestrator | 2025-09-27 03:23:17 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:23:17.626347 | orchestrator | 2025-09-27 03:23:17 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state 
STARTED 2025-09-27 03:23:17.626439 | orchestrator | 2025-09-27 03:23:17 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:23:20.676755 | orchestrator | 2025-09-27 03:23:20 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:23:20.678610 | orchestrator | 2025-09-27 03:23:20 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:23:20.679019 | orchestrator | 2025-09-27 03:23:20 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:23:23.720850 | orchestrator | 2025-09-27 03:23:23 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:23:23.722257 | orchestrator | 2025-09-27 03:23:23 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:23:23.722352 | orchestrator | 2025-09-27 03:23:23 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:23:26.765333 | orchestrator | 2025-09-27 03:23:26 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:23:26.767254 | orchestrator | 2025-09-27 03:23:26 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:23:26.767300 | orchestrator | 2025-09-27 03:23:26 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:23:29.823739 | orchestrator | 2025-09-27 03:23:29 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:23:29.825200 | orchestrator | 2025-09-27 03:23:29 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:23:29.825229 | orchestrator | 2025-09-27 03:23:29 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:23:32.872110 | orchestrator | 2025-09-27 03:23:32 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:23:32.873313 | orchestrator | 2025-09-27 03:23:32 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:23:32.873343 | orchestrator | 2025-09-27 03:23:32 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:23:35.924410 | orchestrator | 2025-09-27 03:23:35 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:23:35.927204 | orchestrator | 2025-09-27 03:23:35 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:23:35.927360 | orchestrator | 2025-09-27 03:23:35 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:23:38.977614 | orchestrator | 2025-09-27 03:23:38 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:23:38.979428 | orchestrator | 2025-09-27 03:23:38 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:23:38.980550 | orchestrator | 2025-09-27 03:23:38 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:23:42.022961 | orchestrator | 2025-09-27 03:23:42 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:23:42.025398 | orchestrator | 2025-09-27 03:23:42 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:23:42.025460 | orchestrator | 2025-09-27 03:23:42 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:23:45.069583 | orchestrator | 2025-09-27 03:23:45 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:23:45.071898 | orchestrator | 2025-09-27 03:23:45 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:23:45.071928 | orchestrator | 2025-09-27 03:23:45 | INFO  | Wait 1 second(s) 
until the next check 2025-09-27 03:23:48.116524 | orchestrator | 2025-09-27 03:23:48 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:23:48.118533 | orchestrator | 2025-09-27 03:23:48 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:23:48.118578 | orchestrator | 2025-09-27 03:23:48 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:23:51.163729 | orchestrator | 2025-09-27 03:23:51 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:23:51.165071 | orchestrator | 2025-09-27 03:23:51 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:23:51.165157 | orchestrator | 2025-09-27 03:23:51 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:23:54.217634 | orchestrator | 2025-09-27 03:23:54 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:23:54.218373 | orchestrator | 2025-09-27 03:23:54 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:23:54.218406 | orchestrator | 2025-09-27 03:23:54 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:23:57.269502 | orchestrator | 2025-09-27 03:23:57 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:23:57.271167 | orchestrator | 2025-09-27 03:23:57 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:23:57.271233 | orchestrator | 2025-09-27 03:23:57 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:24:00.317690 | orchestrator | 2025-09-27 03:24:00 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:24:00.317872 | orchestrator | 2025-09-27 03:24:00 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:24:00.317893 | orchestrator | 2025-09-27 03:24:00 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:24:03.371077 | orchestrator | 2025-09-27 03:24:03 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:24:03.373923 | orchestrator | 2025-09-27 03:24:03 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:24:03.374408 | orchestrator | 2025-09-27 03:24:03 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:24:06.414740 | orchestrator | 2025-09-27 03:24:06 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:24:06.415556 | orchestrator | 2025-09-27 03:24:06 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:24:06.415725 | orchestrator | 2025-09-27 03:24:06 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:24:09.464234 | orchestrator | 2025-09-27 03:24:09 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:24:09.466189 | orchestrator | 2025-09-27 03:24:09 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:24:09.466221 | orchestrator | 2025-09-27 03:24:09 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:24:12.517209 | orchestrator | 2025-09-27 03:24:12 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:24:12.518299 | orchestrator | 2025-09-27 03:24:12 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:24:12.518390 | orchestrator | 2025-09-27 03:24:12 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:24:15.564594 | orchestrator | 2025-09-27 03:24:15 | INFO  | 
Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:24:15.567204 | orchestrator | 2025-09-27 03:24:15 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:24:15.567235 | orchestrator | 2025-09-27 03:24:15 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:24:18.612789 | orchestrator | 2025-09-27 03:24:18 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:24:18.615138 | orchestrator | 2025-09-27 03:24:18 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:24:18.615176 | orchestrator | 2025-09-27 03:24:18 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:24:21.657427 | orchestrator | 2025-09-27 03:24:21 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:24:21.658540 | orchestrator | 2025-09-27 03:24:21 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:24:21.658575 | orchestrator | 2025-09-27 03:24:21 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:24:24.705474 | orchestrator | 2025-09-27 03:24:24 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:24:24.706933 | orchestrator | 2025-09-27 03:24:24 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:24:24.706970 | orchestrator | 2025-09-27 03:24:24 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:24:27.750807 | orchestrator | 2025-09-27 03:24:27 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:24:27.753537 | orchestrator | 2025-09-27 03:24:27 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:24:27.753568 | orchestrator | 2025-09-27 03:24:27 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:24:30.797814 | orchestrator | 2025-09-27 03:24:30 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:24:30.799484 | orchestrator | 2025-09-27 03:24:30 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:24:30.799637 | orchestrator | 2025-09-27 03:24:30 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:24:33.845407 | orchestrator | 2025-09-27 03:24:33 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:24:33.847093 | orchestrator | 2025-09-27 03:24:33 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:24:33.847127 | orchestrator | 2025-09-27 03:24:33 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:24:36.891727 | orchestrator | 2025-09-27 03:24:36 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:24:36.892417 | orchestrator | 2025-09-27 03:24:36 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:24:36.892448 | orchestrator | 2025-09-27 03:24:36 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:24:39.939094 | orchestrator | 2025-09-27 03:24:39 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:24:39.939435 | orchestrator | 2025-09-27 03:24:39 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:24:39.939462 | orchestrator | 2025-09-27 03:24:39 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:24:42.983262 | orchestrator | 2025-09-27 03:24:42 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:24:42.984805 | 
orchestrator | 2025-09-27 03:24:42 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:24:42.984864 | orchestrator | 2025-09-27 03:24:42 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:24:46.036117 | orchestrator | 2025-09-27 03:24:46 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:24:46.038360 | orchestrator | 2025-09-27 03:24:46 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:24:46.038397 | orchestrator | 2025-09-27 03:24:46 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:24:49.085625 | orchestrator | 2025-09-27 03:24:49 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:24:49.088403 | orchestrator | 2025-09-27 03:24:49 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:24:49.088460 | orchestrator | 2025-09-27 03:24:49 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:24:52.139090 | orchestrator | 2025-09-27 03:24:52 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:24:52.141493 | orchestrator | 2025-09-27 03:24:52 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:24:52.141609 | orchestrator | 2025-09-27 03:24:52 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:24:55.195584 | orchestrator | 2025-09-27 03:24:55 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:24:55.196769 | orchestrator | 2025-09-27 03:24:55 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:24:55.197261 | orchestrator | 2025-09-27 03:24:55 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:24:58.240585 | orchestrator | 2025-09-27 03:24:58 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:24:58.241991 | orchestrator | 2025-09-27 03:24:58 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:24:58.242192 | orchestrator | 2025-09-27 03:24:58 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:25:01.286788 | orchestrator | 2025-09-27 03:25:01 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:25:01.288207 | orchestrator | 2025-09-27 03:25:01 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:25:01.288252 | orchestrator | 2025-09-27 03:25:01 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:25:04.337766 | orchestrator | 2025-09-27 03:25:04 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:25:04.339732 | orchestrator | 2025-09-27 03:25:04 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:25:04.339816 | orchestrator | 2025-09-27 03:25:04 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:25:07.382273 | orchestrator | 2025-09-27 03:25:07 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:25:07.383974 | orchestrator | 2025-09-27 03:25:07 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:25:07.384036 | orchestrator | 2025-09-27 03:25:07 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:25:10.426125 | orchestrator | 2025-09-27 03:25:10 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:25:10.426936 | orchestrator | 2025-09-27 03:25:10 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state 
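The repeated block above is the output of a straightforward task-wait loop: the orchestrator queries the state of the two task IDs, prints one status line per task, and sleeps before the next round, repeating until the tasks leave the STARTED state. The sketch below illustrates that polling pattern under stated assumptions; TASK_STATES, get_task_state, and wait_for_tasks are hypothetical names used for illustration, not the actual OSISM implementation (which tracks Celery task state).

    import time
    import uuid

    # Hypothetical in-memory stand-in for the task-state API being polled;
    # the real tooling queries Celery task state, which is not reproduced here.
    TASK_STATES = {}

    def get_task_state(task_id):
        # Assumption: a task reports STARTED until something marks it otherwise.
        return TASK_STATES.get(task_id, "STARTED")

    def wait_for_tasks(task_ids, interval=1):
        """Poll the given task IDs until none is in state STARTED, printing
        the same kind of status lines that appear in the job output above."""
        while True:
            states = {task_id: get_task_state(task_id) for task_id in task_ids}
            for task_id, state in states.items():
                print(f"INFO  | Task {task_id} is in state {state}")
            if all(state != "STARTED" for state in states.values()):
                return states
            print(f"INFO  | Wait {interval} second(s) until the next check")
            time.sleep(interval)

    if __name__ == "__main__":
        tasks = [str(uuid.uuid4()), str(uuid.uuid4())]
        # Simulate both tasks finishing so the example terminates immediately.
        for task_id in tasks:
            TASK_STATES[task_id] = "SUCCESS"
        wait_for_tasks(tasks)

Note that although the message says "Wait 1 second(s)", consecutive checks in the log land about 3 seconds apart, most likely because each round of state queries itself takes a couple of seconds on top of the one-second sleep.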
STARTED 2025-09-27 03:25:10.426967 | orchestrator | 2025-09-27 03:25:10 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:25:13.474968 | orchestrator | 2025-09-27 03:25:13 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:25:13.477215 | orchestrator | 2025-09-27 03:25:13 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:25:13.477289 | orchestrator | 2025-09-27 03:25:13 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:25:16.523289 | orchestrator | 2025-09-27 03:25:16 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:25:16.524645 | orchestrator | 2025-09-27 03:25:16 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:25:16.524675 | orchestrator | 2025-09-27 03:25:16 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:25:19.573267 | orchestrator | 2025-09-27 03:25:19 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:25:19.574453 | orchestrator | 2025-09-27 03:25:19 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:25:19.574484 | orchestrator | 2025-09-27 03:25:19 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:25:22.620551 | orchestrator | 2025-09-27 03:25:22 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:25:22.621862 | orchestrator | 2025-09-27 03:25:22 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:25:22.622065 | orchestrator | 2025-09-27 03:25:22 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:25:25.670147 | orchestrator | 2025-09-27 03:25:25 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:25:25.671465 | orchestrator | 2025-09-27 03:25:25 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:25:25.671535 | orchestrator | 2025-09-27 03:25:25 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:25:28.711881 | orchestrator | 2025-09-27 03:25:28 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:25:28.713319 | orchestrator | 2025-09-27 03:25:28 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:25:28.713354 | orchestrator | 2025-09-27 03:25:28 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:25:31.761532 | orchestrator | 2025-09-27 03:25:31 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:25:31.762703 | orchestrator | 2025-09-27 03:25:31 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:25:31.762735 | orchestrator | 2025-09-27 03:25:31 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:25:34.807535 | orchestrator | 2025-09-27 03:25:34 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:25:34.809382 | orchestrator | 2025-09-27 03:25:34 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:25:34.809864 | orchestrator | 2025-09-27 03:25:34 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:25:37.855249 | orchestrator | 2025-09-27 03:25:37 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:25:37.856788 | orchestrator | 2025-09-27 03:25:37 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:25:37.856816 | orchestrator | 2025-09-27 03:25:37 | INFO  | Wait 1 second(s) 
until the next check 2025-09-27 03:25:40.903881 | orchestrator | 2025-09-27 03:25:40 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:25:40.906459 | orchestrator | 2025-09-27 03:25:40 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:25:40.906541 | orchestrator | 2025-09-27 03:25:40 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:25:43.949908 | orchestrator | 2025-09-27 03:25:43 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:25:43.951372 | orchestrator | 2025-09-27 03:25:43 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:25:43.951843 | orchestrator | 2025-09-27 03:25:43 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:25:46.995096 | orchestrator | 2025-09-27 03:25:46 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:25:46.996086 | orchestrator | 2025-09-27 03:25:46 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:25:46.996601 | orchestrator | 2025-09-27 03:25:46 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:25:50.035435 | orchestrator | 2025-09-27 03:25:50 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:25:50.036921 | orchestrator | 2025-09-27 03:25:50 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:25:50.037249 | orchestrator | 2025-09-27 03:25:50 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:25:53.080931 | orchestrator | 2025-09-27 03:25:53 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:25:53.083172 | orchestrator | 2025-09-27 03:25:53 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:25:53.083209 | orchestrator | 2025-09-27 03:25:53 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:25:56.135882 | orchestrator | 2025-09-27 03:25:56 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:25:56.137670 | orchestrator | 2025-09-27 03:25:56 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:25:56.137754 | orchestrator | 2025-09-27 03:25:56 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:25:59.184077 | orchestrator | 2025-09-27 03:25:59 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:25:59.187838 | orchestrator | 2025-09-27 03:25:59 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:25:59.187878 | orchestrator | 2025-09-27 03:25:59 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:26:02.235261 | orchestrator | 2025-09-27 03:26:02 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:26:02.237055 | orchestrator | 2025-09-27 03:26:02 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:26:02.237093 | orchestrator | 2025-09-27 03:26:02 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:26:05.280504 | orchestrator | 2025-09-27 03:26:05 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:26:05.282739 | orchestrator | 2025-09-27 03:26:05 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:26:05.282771 | orchestrator | 2025-09-27 03:26:05 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:26:08.329787 | orchestrator | 2025-09-27 03:26:08 | INFO  | 
Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:26:08.331048 | orchestrator | 2025-09-27 03:26:08 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:26:08.331143 | orchestrator | 2025-09-27 03:26:08 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:26:11.376427 | orchestrator | 2025-09-27 03:26:11 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:26:11.377442 | orchestrator | 2025-09-27 03:26:11 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:26:11.377476 | orchestrator | 2025-09-27 03:26:11 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:26:14.419537 | orchestrator | 2025-09-27 03:26:14 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:26:14.420309 | orchestrator | 2025-09-27 03:26:14 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:26:14.420344 | orchestrator | 2025-09-27 03:26:14 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:26:17.466154 | orchestrator | 2025-09-27 03:26:17 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:26:17.467725 | orchestrator | 2025-09-27 03:26:17 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:26:17.467840 | orchestrator | 2025-09-27 03:26:17 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:26:20.512356 | orchestrator | 2025-09-27 03:26:20 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:26:20.513495 | orchestrator | 2025-09-27 03:26:20 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:26:20.513523 | orchestrator | 2025-09-27 03:26:20 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:26:23.553041 | orchestrator | 2025-09-27 03:26:23 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:26:23.555110 | orchestrator | 2025-09-27 03:26:23 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:26:23.555143 | orchestrator | 2025-09-27 03:26:23 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:26:26.596337 | orchestrator | 2025-09-27 03:26:26 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:26:26.596734 | orchestrator | 2025-09-27 03:26:26 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:26:26.596764 | orchestrator | 2025-09-27 03:26:26 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:26:29.646913 | orchestrator | 2025-09-27 03:26:29 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:26:29.647657 | orchestrator | 2025-09-27 03:26:29 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:26:29.647748 | orchestrator | 2025-09-27 03:26:29 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:26:32.697567 | orchestrator | 2025-09-27 03:26:32 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:26:32.699410 | orchestrator | 2025-09-27 03:26:32 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:26:32.699443 | orchestrator | 2025-09-27 03:26:32 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:26:35.745094 | orchestrator | 2025-09-27 03:26:35 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:26:35.746634 | 
orchestrator | 2025-09-27 03:26:35 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:26:35.746668 | orchestrator | 2025-09-27 03:26:35 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:26:38.794372 | orchestrator | 2025-09-27 03:26:38 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:26:38.795454 | orchestrator | 2025-09-27 03:26:38 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:26:38.795484 | orchestrator | 2025-09-27 03:26:38 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:26:41.841543 | orchestrator | 2025-09-27 03:26:41 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:26:41.843671 | orchestrator | 2025-09-27 03:26:41 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:26:41.843708 | orchestrator | 2025-09-27 03:26:41 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:26:44.889134 | orchestrator | 2025-09-27 03:26:44 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:26:44.891641 | orchestrator | 2025-09-27 03:26:44 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:26:44.891974 | orchestrator | 2025-09-27 03:26:44 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:26:47.932511 | orchestrator | 2025-09-27 03:26:47 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:26:47.934160 | orchestrator | 2025-09-27 03:26:47 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:26:47.934193 | orchestrator | 2025-09-27 03:26:47 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:26:50.976501 | orchestrator | 2025-09-27 03:26:50 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:26:50.978473 | orchestrator | 2025-09-27 03:26:50 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:26:50.978547 | orchestrator | 2025-09-27 03:26:50 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:26:54.021347 | orchestrator | 2025-09-27 03:26:54 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:26:54.021591 | orchestrator | 2025-09-27 03:26:54 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:26:54.021693 | orchestrator | 2025-09-27 03:26:54 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:26:57.060656 | orchestrator | 2025-09-27 03:26:57 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:26:57.062353 | orchestrator | 2025-09-27 03:26:57 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:26:57.062658 | orchestrator | 2025-09-27 03:26:57 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:27:00.109321 | orchestrator | 2025-09-27 03:27:00 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:27:00.111809 | orchestrator | 2025-09-27 03:27:00 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:27:00.111839 | orchestrator | 2025-09-27 03:27:00 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:27:03.155583 | orchestrator | 2025-09-27 03:27:03 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:27:03.158007 | orchestrator | 2025-09-27 03:27:03 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state 
STARTED 2025-09-27 03:27:03.158110 | orchestrator | 2025-09-27 03:27:03 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:27:06.207558 | orchestrator | 2025-09-27 03:27:06 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:27:06.209341 | orchestrator | 2025-09-27 03:27:06 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:27:06.209371 | orchestrator | 2025-09-27 03:27:06 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:27:09.258279 | orchestrator | 2025-09-27 03:27:09 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:27:09.259876 | orchestrator | 2025-09-27 03:27:09 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:27:09.259903 | orchestrator | 2025-09-27 03:27:09 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:27:12.314165 | orchestrator | 2025-09-27 03:27:12 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:27:12.314717 | orchestrator | 2025-09-27 03:27:12 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:27:12.314876 | orchestrator | 2025-09-27 03:27:12 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:27:15.367476 | orchestrator | 2025-09-27 03:27:15 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:27:15.368853 | orchestrator | 2025-09-27 03:27:15 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:27:15.369158 | orchestrator | 2025-09-27 03:27:15 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:27:18.419766 | orchestrator | 2025-09-27 03:27:18 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:27:18.422149 | orchestrator | 2025-09-27 03:27:18 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:27:18.422183 | orchestrator | 2025-09-27 03:27:18 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:27:21.458306 | orchestrator | 2025-09-27 03:27:21 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:27:21.459348 | orchestrator | 2025-09-27 03:27:21 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:27:21.459383 | orchestrator | 2025-09-27 03:27:21 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:27:24.503329 | orchestrator | 2025-09-27 03:27:24 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:27:24.504897 | orchestrator | 2025-09-27 03:27:24 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:27:24.504930 | orchestrator | 2025-09-27 03:27:24 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:27:27.550528 | orchestrator | 2025-09-27 03:27:27 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:27:27.552360 | orchestrator | 2025-09-27 03:27:27 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:27:27.552389 | orchestrator | 2025-09-27 03:27:27 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:27:30.598877 | orchestrator | 2025-09-27 03:27:30 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:27:30.600755 | orchestrator | 2025-09-27 03:27:30 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:27:30.600782 | orchestrator | 2025-09-27 03:27:30 | INFO  | Wait 1 second(s) 
until the next check 2025-09-27 03:27:33.643054 | orchestrator | 2025-09-27 03:27:33 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:27:33.645368 | orchestrator | 2025-09-27 03:27:33 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:27:33.645396 | orchestrator | 2025-09-27 03:27:33 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:27:36.687789 | orchestrator | 2025-09-27 03:27:36 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:27:36.690438 | orchestrator | 2025-09-27 03:27:36 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:27:36.690475 | orchestrator | 2025-09-27 03:27:36 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:27:39.735673 | orchestrator | 2025-09-27 03:27:39 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:27:39.737336 | orchestrator | 2025-09-27 03:27:39 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:27:39.737359 | orchestrator | 2025-09-27 03:27:39 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:27:42.787920 | orchestrator | 2025-09-27 03:27:42 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:27:42.790741 | orchestrator | 2025-09-27 03:27:42 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:27:42.790772 | orchestrator | 2025-09-27 03:27:42 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:27:45.836399 | orchestrator | 2025-09-27 03:27:45 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:27:45.837708 | orchestrator | 2025-09-27 03:27:45 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:27:45.837790 | orchestrator | 2025-09-27 03:27:45 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:27:48.879864 | orchestrator | 2025-09-27 03:27:48 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:27:48.880483 | orchestrator | 2025-09-27 03:27:48 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:27:48.880514 | orchestrator | 2025-09-27 03:27:48 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:27:51.931434 | orchestrator | 2025-09-27 03:27:51 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:27:51.932688 | orchestrator | 2025-09-27 03:27:51 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:27:51.932717 | orchestrator | 2025-09-27 03:27:51 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:27:54.980756 | orchestrator | 2025-09-27 03:27:54 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:27:54.983431 | orchestrator | 2025-09-27 03:27:54 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:27:54.983469 | orchestrator | 2025-09-27 03:27:54 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:27:58.030403 | orchestrator | 2025-09-27 03:27:58 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:27:58.032513 | orchestrator | 2025-09-27 03:27:58 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:27:58.032547 | orchestrator | 2025-09-27 03:27:58 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:28:01.088875 | orchestrator | 2025-09-27 03:28:01 | INFO  | 
Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:28:01.090220 | orchestrator | 2025-09-27 03:28:01 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:28:01.090289 | orchestrator | 2025-09-27 03:28:01 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:28:04.139288 | orchestrator | 2025-09-27 03:28:04 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:28:04.141214 | orchestrator | 2025-09-27 03:28:04 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:28:04.141477 | orchestrator | 2025-09-27 03:28:04 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:28:07.188602 | orchestrator | 2025-09-27 03:28:07 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:28:07.191061 | orchestrator | 2025-09-27 03:28:07 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:28:07.191091 | orchestrator | 2025-09-27 03:28:07 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:28:10.233875 | orchestrator | 2025-09-27 03:28:10 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:28:10.234696 | orchestrator | 2025-09-27 03:28:10 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:28:10.234726 | orchestrator | 2025-09-27 03:28:10 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:28:13.280077 | orchestrator | 2025-09-27 03:28:13 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:28:13.281363 | orchestrator | 2025-09-27 03:28:13 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:28:13.281667 | orchestrator | 2025-09-27 03:28:13 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:28:16.324264 | orchestrator | 2025-09-27 03:28:16 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:28:16.326594 | orchestrator | 2025-09-27 03:28:16 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:28:16.326622 | orchestrator | 2025-09-27 03:28:16 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:28:19.369502 | orchestrator | 2025-09-27 03:28:19 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:28:19.370624 | orchestrator | 2025-09-27 03:28:19 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:28:19.370657 | orchestrator | 2025-09-27 03:28:19 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:28:22.414998 | orchestrator | 2025-09-27 03:28:22 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:28:22.418089 | orchestrator | 2025-09-27 03:28:22 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:28:22.418155 | orchestrator | 2025-09-27 03:28:22 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:28:25.464017 | orchestrator | 2025-09-27 03:28:25 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:28:25.466272 | orchestrator | 2025-09-27 03:28:25 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:28:25.466302 | orchestrator | 2025-09-27 03:28:25 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:28:28.511004 | orchestrator | 2025-09-27 03:28:28 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:28:28.513340 | 
orchestrator | 2025-09-27 03:28:28 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:28:28.513615 | orchestrator | 2025-09-27 03:28:28 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:28:31.555055 | orchestrator | 2025-09-27 03:28:31 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:28:31.557102 | orchestrator | 2025-09-27 03:28:31 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:28:31.557141 | orchestrator | 2025-09-27 03:28:31 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:28:34.601071 | orchestrator | 2025-09-27 03:28:34 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:28:34.603247 | orchestrator | 2025-09-27 03:28:34 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:28:34.603701 | orchestrator | 2025-09-27 03:28:34 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:28:37.645844 | orchestrator | 2025-09-27 03:28:37 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:28:37.647187 | orchestrator | 2025-09-27 03:28:37 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:28:37.647331 | orchestrator | 2025-09-27 03:28:37 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:28:40.696416 | orchestrator | 2025-09-27 03:28:40 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:28:40.697661 | orchestrator | 2025-09-27 03:28:40 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:28:40.697869 | orchestrator | 2025-09-27 03:28:40 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:28:43.745135 | orchestrator | 2025-09-27 03:28:43 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:28:43.746421 | orchestrator | 2025-09-27 03:28:43 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:28:43.746607 | orchestrator | 2025-09-27 03:28:43 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:28:46.795609 | orchestrator | 2025-09-27 03:28:46 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:28:46.798117 | orchestrator | 2025-09-27 03:28:46 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:28:46.798154 | orchestrator | 2025-09-27 03:28:46 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:28:49.847609 | orchestrator | 2025-09-27 03:28:49 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:28:49.847716 | orchestrator | 2025-09-27 03:28:49 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:28:49.847732 | orchestrator | 2025-09-27 03:28:49 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:28:52.894438 | orchestrator | 2025-09-27 03:28:52 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:28:52.894614 | orchestrator | 2025-09-27 03:28:52 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:28:52.894752 | orchestrator | 2025-09-27 03:28:52 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:28:55.935761 | orchestrator | 2025-09-27 03:28:55 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:28:55.937236 | orchestrator | 2025-09-27 03:28:55 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state 
STARTED 2025-09-27 03:28:55.937438 | orchestrator | 2025-09-27 03:28:55 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:28:58.989242 | orchestrator | 2025-09-27 03:28:58 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:28:58.990929 | orchestrator | 2025-09-27 03:28:58 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:28:58.991189 | orchestrator | 2025-09-27 03:28:58 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:29:02.040373 | orchestrator | 2025-09-27 03:29:02 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:29:02.041474 | orchestrator | 2025-09-27 03:29:02 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:29:02.042780 | orchestrator | 2025-09-27 03:29:02 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:29:05.086928 | orchestrator | 2025-09-27 03:29:05 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:29:05.089110 | orchestrator | 2025-09-27 03:29:05 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:29:05.089159 | orchestrator | 2025-09-27 03:29:05 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:29:08.138421 | orchestrator | 2025-09-27 03:29:08 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:29:08.141061 | orchestrator | 2025-09-27 03:29:08 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:29:08.141091 | orchestrator | 2025-09-27 03:29:08 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:29:11.187197 | orchestrator | 2025-09-27 03:29:11 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:29:11.189228 | orchestrator | 2025-09-27 03:29:11 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:29:11.189256 | orchestrator | 2025-09-27 03:29:11 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:29:14.234290 | orchestrator | 2025-09-27 03:29:14 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:29:14.235822 | orchestrator | 2025-09-27 03:29:14 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:29:14.237095 | orchestrator | 2025-09-27 03:29:14 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:29:17.283561 | orchestrator | 2025-09-27 03:29:17 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:29:17.284782 | orchestrator | 2025-09-27 03:29:17 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:29:17.284926 | orchestrator | 2025-09-27 03:29:17 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:29:20.329164 | orchestrator | 2025-09-27 03:29:20 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:29:20.332034 | orchestrator | 2025-09-27 03:29:20 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:29:20.332070 | orchestrator | 2025-09-27 03:29:20 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:29:23.389682 | orchestrator | 2025-09-27 03:29:23 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:29:23.392715 | orchestrator | 2025-09-27 03:29:23 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:29:23.392860 | orchestrator | 2025-09-27 03:29:23 | INFO  | Wait 1 second(s) 
until the next check 2025-09-27 03:29:26.443670 | orchestrator | 2025-09-27 03:29:26 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:29:26.446271 | orchestrator | 2025-09-27 03:29:26 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:29:26.446384 | orchestrator | 2025-09-27 03:29:26 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:29:29.499434 | orchestrator | 2025-09-27 03:29:29 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:29:29.501554 | orchestrator | 2025-09-27 03:29:29 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:29:29.501583 | orchestrator | 2025-09-27 03:29:29 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:29:32.555641 | orchestrator | 2025-09-27 03:29:32 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:29:32.557708 | orchestrator | 2025-09-27 03:29:32 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:29:32.557745 | orchestrator | 2025-09-27 03:29:32 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:29:35.606565 | orchestrator | 2025-09-27 03:29:35 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:29:35.607527 | orchestrator | 2025-09-27 03:29:35 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:29:35.607603 | orchestrator | 2025-09-27 03:29:35 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:29:38.652205 | orchestrator | 2025-09-27 03:29:38 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:29:38.653696 | orchestrator | 2025-09-27 03:29:38 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:29:38.653724 | orchestrator | 2025-09-27 03:29:38 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:29:41.701186 | orchestrator | 2025-09-27 03:29:41 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:29:41.703304 | orchestrator | 2025-09-27 03:29:41 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:29:41.703387 | orchestrator | 2025-09-27 03:29:41 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:29:44.745506 | orchestrator | 2025-09-27 03:29:44 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:29:44.746673 | orchestrator | 2025-09-27 03:29:44 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:29:44.746765 | orchestrator | 2025-09-27 03:29:44 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:29:47.802416 | orchestrator | 2025-09-27 03:29:47 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:29:47.804089 | orchestrator | 2025-09-27 03:29:47 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:29:47.804135 | orchestrator | 2025-09-27 03:29:47 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:29:50.846958 | orchestrator | 2025-09-27 03:29:50 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:29:50.848633 | orchestrator | 2025-09-27 03:29:50 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:29:50.849228 | orchestrator | 2025-09-27 03:29:50 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:29:53.893502 | orchestrator | 2025-09-27 03:29:53 | INFO  | 
Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:29:53.895255 | orchestrator | 2025-09-27 03:29:53 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:29:53.895346 | orchestrator | 2025-09-27 03:29:53 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:29:56.939495 | orchestrator | 2025-09-27 03:29:56 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:29:56.940279 | orchestrator | 2025-09-27 03:29:56 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:29:56.940310 | orchestrator | 2025-09-27 03:29:56 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:29:59.987322 | orchestrator | 2025-09-27 03:29:59 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:29:59.989466 | orchestrator | 2025-09-27 03:29:59 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:29:59.989655 | orchestrator | 2025-09-27 03:29:59 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:30:03.036086 | orchestrator | 2025-09-27 03:30:03 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:30:03.037155 | orchestrator | 2025-09-27 03:30:03 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:30:03.037178 | orchestrator | 2025-09-27 03:30:03 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:30:06.083003 | orchestrator | 2025-09-27 03:30:06 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:30:06.084468 | orchestrator | 2025-09-27 03:30:06 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:30:06.084512 | orchestrator | 2025-09-27 03:30:06 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:30:09.125242 | orchestrator | 2025-09-27 03:30:09 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:30:09.129202 | orchestrator | 2025-09-27 03:30:09 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:30:09.129231 | orchestrator | 2025-09-27 03:30:09 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:30:12.175900 | orchestrator | 2025-09-27 03:30:12 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:30:12.177299 | orchestrator | 2025-09-27 03:30:12 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:30:12.177324 | orchestrator | 2025-09-27 03:30:12 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:30:15.226158 | orchestrator | 2025-09-27 03:30:15 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:30:15.229521 | orchestrator | 2025-09-27 03:30:15 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:30:15.229650 | orchestrator | 2025-09-27 03:30:15 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:30:18.280614 | orchestrator | 2025-09-27 03:30:18 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:30:18.283922 | orchestrator | 2025-09-27 03:30:18 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:30:18.284066 | orchestrator | 2025-09-27 03:30:18 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:30:21.333854 | orchestrator | 2025-09-27 03:30:21 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:30:21.334455 | 
orchestrator | 2025-09-27 03:30:21 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:30:21.334575 | orchestrator | 2025-09-27 03:30:21 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:30:24.380430 | orchestrator | 2025-09-27 03:30:24 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:30:24.381335 | orchestrator | 2025-09-27 03:30:24 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:30:24.381414 | orchestrator | 2025-09-27 03:30:24 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:30:27.428777 | orchestrator | 2025-09-27 03:30:27 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:30:27.430312 | orchestrator | 2025-09-27 03:30:27 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:30:27.430389 | orchestrator | 2025-09-27 03:30:27 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:30:30.475908 | orchestrator | 2025-09-27 03:30:30 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:30:30.476285 | orchestrator | 2025-09-27 03:30:30 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:30:30.476571 | orchestrator | 2025-09-27 03:30:30 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:30:33.530598 | orchestrator | 2025-09-27 03:30:33 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:30:33.531488 | orchestrator | 2025-09-27 03:30:33 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:30:33.531758 | orchestrator | 2025-09-27 03:30:33 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:30:36.577036 | orchestrator | 2025-09-27 03:30:36 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:30:36.578888 | orchestrator | 2025-09-27 03:30:36 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:30:36.578915 | orchestrator | 2025-09-27 03:30:36 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:30:39.627009 | orchestrator | 2025-09-27 03:30:39 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:30:39.629796 | orchestrator | 2025-09-27 03:30:39 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:30:39.629902 | orchestrator | 2025-09-27 03:30:39 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:30:42.680020 | orchestrator | 2025-09-27 03:30:42 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:30:42.681017 | orchestrator | 2025-09-27 03:30:42 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:30:42.681203 | orchestrator | 2025-09-27 03:30:42 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:30:45.725464 | orchestrator | 2025-09-27 03:30:45 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:30:45.727185 | orchestrator | 2025-09-27 03:30:45 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:30:45.727214 | orchestrator | 2025-09-27 03:30:45 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:30:48.774477 | orchestrator | 2025-09-27 03:30:48 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:30:48.776148 | orchestrator | 2025-09-27 03:30:48 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state 
STARTED 2025-09-27 03:30:48.776224 | orchestrator | 2025-09-27 03:30:48 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:30:51.819401 | orchestrator | 2025-09-27 03:30:51 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:30:51.820229 | orchestrator | 2025-09-27 03:30:51 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:30:51.820248 | orchestrator | 2025-09-27 03:30:51 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:30:54.864095 | orchestrator | 2025-09-27 03:30:54 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:30:54.866519 | orchestrator | 2025-09-27 03:30:54 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:30:54.866920 | orchestrator | 2025-09-27 03:30:54 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:30:57.910655 | orchestrator | 2025-09-27 03:30:57 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:30:57.911579 | orchestrator | 2025-09-27 03:30:57 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:30:57.911822 | orchestrator | 2025-09-27 03:30:57 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:31:00.955018 | orchestrator | 2025-09-27 03:31:00 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:31:00.956033 | orchestrator | 2025-09-27 03:31:00 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:31:00.956051 | orchestrator | 2025-09-27 03:31:00 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:31:04.006361 | orchestrator | 2025-09-27 03:31:04 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:31:04.009412 | orchestrator | 2025-09-27 03:31:04 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:31:04.010137 | orchestrator | 2025-09-27 03:31:04 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:31:07.057210 | orchestrator | 2025-09-27 03:31:07 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:31:07.059163 | orchestrator | 2025-09-27 03:31:07 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:31:07.059191 | orchestrator | 2025-09-27 03:31:07 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:31:10.104352 | orchestrator | 2025-09-27 03:31:10 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:31:10.105173 | orchestrator | 2025-09-27 03:31:10 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:31:10.105249 | orchestrator | 2025-09-27 03:31:10 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:31:13.153431 | orchestrator | 2025-09-27 03:31:13 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:31:13.154385 | orchestrator | 2025-09-27 03:31:13 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:31:13.154655 | orchestrator | 2025-09-27 03:31:13 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:31:16.206217 | orchestrator | 2025-09-27 03:31:16 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:31:16.206842 | orchestrator | 2025-09-27 03:31:16 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:31:16.207293 | orchestrator | 2025-09-27 03:31:16 | INFO  | Wait 1 second(s) 
until the next check 2025-09-27 03:31:19.255366 | orchestrator | 2025-09-27 03:31:19 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:31:19.256294 | orchestrator | 2025-09-27 03:31:19 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:31:19.256325 | orchestrator | 2025-09-27 03:31:19 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:31:22.310639 | orchestrator | 2025-09-27 03:31:22 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:31:22.311786 | orchestrator | 2025-09-27 03:31:22 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:31:22.311898 | orchestrator | 2025-09-27 03:31:22 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:31:25.358613 | orchestrator | 2025-09-27 03:31:25 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:31:25.359653 | orchestrator | 2025-09-27 03:31:25 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:31:25.359715 | orchestrator | 2025-09-27 03:31:25 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:31:28.405255 | orchestrator | 2025-09-27 03:31:28 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:31:28.406292 | orchestrator | 2025-09-27 03:31:28 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:31:28.406320 | orchestrator | 2025-09-27 03:31:28 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:31:31.451859 | orchestrator | 2025-09-27 03:31:31 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:31:31.453342 | orchestrator | 2025-09-27 03:31:31 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:31:31.453415 | orchestrator | 2025-09-27 03:31:31 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:31:34.505708 | orchestrator | 2025-09-27 03:31:34 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:31:34.506401 | orchestrator | 2025-09-27 03:31:34 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:31:34.506431 | orchestrator | 2025-09-27 03:31:34 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:31:37.556226 | orchestrator | 2025-09-27 03:31:37 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:31:37.557164 | orchestrator | 2025-09-27 03:31:37 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:31:37.557297 | orchestrator | 2025-09-27 03:31:37 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:31:40.605470 | orchestrator | 2025-09-27 03:31:40 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:31:40.606106 | orchestrator | 2025-09-27 03:31:40 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:31:40.606137 | orchestrator | 2025-09-27 03:31:40 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:31:43.657283 | orchestrator | 2025-09-27 03:31:43 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:31:43.658880 | orchestrator | 2025-09-27 03:31:43 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:31:43.659002 | orchestrator | 2025-09-27 03:31:43 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:31:46.703474 | orchestrator | 2025-09-27 03:31:46 | INFO  | 
Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:31:46.706215 | orchestrator | 2025-09-27 03:31:46 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:31:46.706520 | orchestrator | 2025-09-27 03:31:46 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:31:49.750384 | orchestrator | 2025-09-27 03:31:49 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:31:49.752140 | orchestrator | 2025-09-27 03:31:49 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:31:49.752382 | orchestrator | 2025-09-27 03:31:49 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:31:52.800427 | orchestrator | 2025-09-27 03:31:52 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:31:52.801038 | orchestrator | 2025-09-27 03:31:52 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:31:52.801137 | orchestrator | 2025-09-27 03:31:52 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:31:55.844503 | orchestrator | 2025-09-27 03:31:55 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:31:55.846133 | orchestrator | 2025-09-27 03:31:55 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:31:55.846237 | orchestrator | 2025-09-27 03:31:55 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:31:58.890261 | orchestrator | 2025-09-27 03:31:58 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:31:58.891510 | orchestrator | 2025-09-27 03:31:58 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:31:58.891534 | orchestrator | 2025-09-27 03:31:58 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:32:01.934300 | orchestrator | 2025-09-27 03:32:01 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:32:01.937076 | orchestrator | 2025-09-27 03:32:01 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:32:01.937564 | orchestrator | 2025-09-27 03:32:01 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:32:04.986471 | orchestrator | 2025-09-27 03:32:04 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:32:04.989206 | orchestrator | 2025-09-27 03:32:04 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:32:04.989234 | orchestrator | 2025-09-27 03:32:04 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:32:08.039317 | orchestrator | 2025-09-27 03:32:08 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:32:08.040484 | orchestrator | 2025-09-27 03:32:08 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:32:08.040593 | orchestrator | 2025-09-27 03:32:08 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:32:11.083292 | orchestrator | 2025-09-27 03:32:11 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:32:11.084777 | orchestrator | 2025-09-27 03:32:11 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:32:11.084809 | orchestrator | 2025-09-27 03:32:11 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:32:14.133211 | orchestrator | 2025-09-27 03:32:14 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:32:14.134887 | 
orchestrator | 2025-09-27 03:32:14 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:32:14.134947 | orchestrator | 2025-09-27 03:32:14 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:32:17.181561 | orchestrator | 2025-09-27 03:32:17 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:32:17.182449 | orchestrator | 2025-09-27 03:32:17 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:32:17.182477 | orchestrator | 2025-09-27 03:32:17 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:32:20.228983 | orchestrator | 2025-09-27 03:32:20 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:32:20.229233 | orchestrator | 2025-09-27 03:32:20 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:32:20.229257 | orchestrator | 2025-09-27 03:32:20 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:32:23.280790 | orchestrator | 2025-09-27 03:32:23 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:32:23.282580 | orchestrator | 2025-09-27 03:32:23 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:32:23.282735 | orchestrator | 2025-09-27 03:32:23 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:32:26.330093 | orchestrator | 2025-09-27 03:32:26 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:32:26.332127 | orchestrator | 2025-09-27 03:32:26 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:32:26.332160 | orchestrator | 2025-09-27 03:32:26 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:32:29.379010 | orchestrator | 2025-09-27 03:32:29 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:32:29.380348 | orchestrator | 2025-09-27 03:32:29 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:32:29.380381 | orchestrator | 2025-09-27 03:32:29 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:32:32.431148 | orchestrator | 2025-09-27 03:32:32 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:32:32.432284 | orchestrator | 2025-09-27 03:32:32 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:32:32.432575 | orchestrator | 2025-09-27 03:32:32 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:32:35.482550 | orchestrator | 2025-09-27 03:32:35 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:32:35.484036 | orchestrator | 2025-09-27 03:32:35 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:32:35.484062 | orchestrator | 2025-09-27 03:32:35 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:32:38.533726 | orchestrator | 2025-09-27 03:32:38 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:32:38.534348 | orchestrator | 2025-09-27 03:32:38 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:32:38.534379 | orchestrator | 2025-09-27 03:32:38 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:32:41.582470 | orchestrator | 2025-09-27 03:32:41 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:32:41.584024 | orchestrator | 2025-09-27 03:32:41 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state 
STARTED 2025-09-27 03:32:41.584053 | orchestrator | 2025-09-27 03:32:41 | INFO  | Wait 1 second(s) until the next check
2025-09-27 03:32:44 to 03:42:02 | orchestrator | (repeated polling output: tasks c8c195a8-0572-4728-82e9-0d11795e0ba9 and 6080a85d-265e-44df-8fd4-200b92feb3b5 reported in state STARTED on every check, re-checked roughly every 3 seconds)
2025-09-27 03:42:05.548487 | orchestrator | 2025-09-27 03:42:05 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED
2025-09-27 03:42:05.549946 | orchestrator | 2025-09-27 03:42:05 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state
STARTED 2025-09-27 03:42:05.549974 | orchestrator | 2025-09-27 03:42:05 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:42:08.600113 | orchestrator | 2025-09-27 03:42:08 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:42:08.602009 | orchestrator | 2025-09-27 03:42:08 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:42:08.602095 | orchestrator | 2025-09-27 03:42:08 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:42:11.648305 | orchestrator | 2025-09-27 03:42:11 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:42:11.649735 | orchestrator | 2025-09-27 03:42:11 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:42:11.649877 | orchestrator | 2025-09-27 03:42:11 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:42:14.696828 | orchestrator | 2025-09-27 03:42:14 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:42:14.698382 | orchestrator | 2025-09-27 03:42:14 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:42:14.698426 | orchestrator | 2025-09-27 03:42:14 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:42:17.748643 | orchestrator | 2025-09-27 03:42:17 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:42:17.749598 | orchestrator | 2025-09-27 03:42:17 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:42:17.749671 | orchestrator | 2025-09-27 03:42:17 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:42:20.799472 | orchestrator | 2025-09-27 03:42:20 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:42:20.803203 | orchestrator | 2025-09-27 03:42:20 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:42:20.803234 | orchestrator | 2025-09-27 03:42:20 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:42:23.850167 | orchestrator | 2025-09-27 03:42:23 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:42:23.851821 | orchestrator | 2025-09-27 03:42:23 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:42:23.851998 | orchestrator | 2025-09-27 03:42:23 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:42:26.893153 | orchestrator | 2025-09-27 03:42:26 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:42:26.894615 | orchestrator | 2025-09-27 03:42:26 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:42:26.894644 | orchestrator | 2025-09-27 03:42:26 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:42:29.940296 | orchestrator | 2025-09-27 03:42:29 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:42:29.941501 | orchestrator | 2025-09-27 03:42:29 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:42:29.941530 | orchestrator | 2025-09-27 03:42:29 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:42:32.982212 | orchestrator | 2025-09-27 03:42:32 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:42:32.984529 | orchestrator | 2025-09-27 03:42:32 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:42:32.984556 | orchestrator | 2025-09-27 03:42:32 | INFO  | Wait 1 second(s) 
until the next check 2025-09-27 03:42:36.034801 | orchestrator | 2025-09-27 03:42:36 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:42:36.036185 | orchestrator | 2025-09-27 03:42:36 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:42:36.036229 | orchestrator | 2025-09-27 03:42:36 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:42:39.082271 | orchestrator | 2025-09-27 03:42:39 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:42:39.084750 | orchestrator | 2025-09-27 03:42:39 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:42:39.084808 | orchestrator | 2025-09-27 03:42:39 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:42:42.136372 | orchestrator | 2025-09-27 03:42:42 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:42:42.139765 | orchestrator | 2025-09-27 03:42:42 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:42:42.140112 | orchestrator | 2025-09-27 03:42:42 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:42:45.183902 | orchestrator | 2025-09-27 03:42:45 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:42:45.186119 | orchestrator | 2025-09-27 03:42:45 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:42:45.186155 | orchestrator | 2025-09-27 03:42:45 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:42:48.231998 | orchestrator | 2025-09-27 03:42:48 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:42:48.233146 | orchestrator | 2025-09-27 03:42:48 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:42:48.233215 | orchestrator | 2025-09-27 03:42:48 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:42:51.277554 | orchestrator | 2025-09-27 03:42:51 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:42:51.278978 | orchestrator | 2025-09-27 03:42:51 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:42:51.279041 | orchestrator | 2025-09-27 03:42:51 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:42:54.330354 | orchestrator | 2025-09-27 03:42:54 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:42:54.331349 | orchestrator | 2025-09-27 03:42:54 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:42:54.331637 | orchestrator | 2025-09-27 03:42:54 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:42:57.375580 | orchestrator | 2025-09-27 03:42:57 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:42:57.376512 | orchestrator | 2025-09-27 03:42:57 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:42:57.376733 | orchestrator | 2025-09-27 03:42:57 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:43:00.416881 | orchestrator | 2025-09-27 03:43:00 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:43:00.418000 | orchestrator | 2025-09-27 03:43:00 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:43:00.418099 | orchestrator | 2025-09-27 03:43:00 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:43:03.467268 | orchestrator | 2025-09-27 03:43:03 | INFO  | 
Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:43:03.468522 | orchestrator | 2025-09-27 03:43:03 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:43:03.468554 | orchestrator | 2025-09-27 03:43:03 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:43:06.507938 | orchestrator | 2025-09-27 03:43:06 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:43:06.509468 | orchestrator | 2025-09-27 03:43:06 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:43:06.509497 | orchestrator | 2025-09-27 03:43:06 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:43:09.557066 | orchestrator | 2025-09-27 03:43:09 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:43:09.557993 | orchestrator | 2025-09-27 03:43:09 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:43:09.558135 | orchestrator | 2025-09-27 03:43:09 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:43:12.604301 | orchestrator | 2025-09-27 03:43:12 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:43:12.606220 | orchestrator | 2025-09-27 03:43:12 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:43:12.606515 | orchestrator | 2025-09-27 03:43:12 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:43:15.655933 | orchestrator | 2025-09-27 03:43:15 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:43:15.657563 | orchestrator | 2025-09-27 03:43:15 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:43:15.657961 | orchestrator | 2025-09-27 03:43:15 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:43:18.702074 | orchestrator | 2025-09-27 03:43:18 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:43:18.703329 | orchestrator | 2025-09-27 03:43:18 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:43:18.703378 | orchestrator | 2025-09-27 03:43:18 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:43:21.742814 | orchestrator | 2025-09-27 03:43:21 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:43:21.744620 | orchestrator | 2025-09-27 03:43:21 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:43:21.744686 | orchestrator | 2025-09-27 03:43:21 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:43:24.790986 | orchestrator | 2025-09-27 03:43:24 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:43:24.793720 | orchestrator | 2025-09-27 03:43:24 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:43:24.793774 | orchestrator | 2025-09-27 03:43:24 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:43:27.833692 | orchestrator | 2025-09-27 03:43:27 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:43:27.835463 | orchestrator | 2025-09-27 03:43:27 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:43:27.835491 | orchestrator | 2025-09-27 03:43:27 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:43:30.872542 | orchestrator | 2025-09-27 03:43:30 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:43:30.874470 | 
orchestrator | 2025-09-27 03:43:30 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:43:30.874514 | orchestrator | 2025-09-27 03:43:30 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:43:33.923927 | orchestrator | 2025-09-27 03:43:33 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:43:33.925431 | orchestrator | 2025-09-27 03:43:33 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:43:33.925853 | orchestrator | 2025-09-27 03:43:33 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:43:36.971446 | orchestrator | 2025-09-27 03:43:36 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:43:36.972925 | orchestrator | 2025-09-27 03:43:36 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:43:36.973236 | orchestrator | 2025-09-27 03:43:36 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:43:40.021755 | orchestrator | 2025-09-27 03:43:40 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:43:40.023265 | orchestrator | 2025-09-27 03:43:40 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:43:40.023343 | orchestrator | 2025-09-27 03:43:40 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:43:43.072811 | orchestrator | 2025-09-27 03:43:43 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:43:43.073968 | orchestrator | 2025-09-27 03:43:43 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:43:43.073995 | orchestrator | 2025-09-27 03:43:43 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:43:46.122479 | orchestrator | 2025-09-27 03:43:46 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:43:46.122772 | orchestrator | 2025-09-27 03:43:46 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:43:46.122799 | orchestrator | 2025-09-27 03:43:46 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:43:49.175329 | orchestrator | 2025-09-27 03:43:49 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:43:49.176862 | orchestrator | 2025-09-27 03:43:49 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:43:49.176886 | orchestrator | 2025-09-27 03:43:49 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:43:52.218726 | orchestrator | 2025-09-27 03:43:52 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:43:52.219228 | orchestrator | 2025-09-27 03:43:52 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:43:52.219299 | orchestrator | 2025-09-27 03:43:52 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:43:55.265390 | orchestrator | 2025-09-27 03:43:55 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:43:55.266000 | orchestrator | 2025-09-27 03:43:55 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:43:55.266110 | orchestrator | 2025-09-27 03:43:55 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:43:58.310108 | orchestrator | 2025-09-27 03:43:58 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:43:58.311798 | orchestrator | 2025-09-27 03:43:58 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state 
STARTED 2025-09-27 03:43:58.311820 | orchestrator | 2025-09-27 03:43:58 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:44:01.351064 | orchestrator | 2025-09-27 03:44:01 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:44:01.351835 | orchestrator | 2025-09-27 03:44:01 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:44:01.351864 | orchestrator | 2025-09-27 03:44:01 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:44:04.405485 | orchestrator | 2025-09-27 03:44:04 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:44:04.406521 | orchestrator | 2025-09-27 03:44:04 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:44:04.406549 | orchestrator | 2025-09-27 03:44:04 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:44:07.453017 | orchestrator | 2025-09-27 03:44:07 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:44:07.456527 | orchestrator | 2025-09-27 03:44:07 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:44:07.456679 | orchestrator | 2025-09-27 03:44:07 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:44:10.504482 | orchestrator | 2025-09-27 03:44:10 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:44:10.507024 | orchestrator | 2025-09-27 03:44:10 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:44:10.507323 | orchestrator | 2025-09-27 03:44:10 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:44:13.552503 | orchestrator | 2025-09-27 03:44:13 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:44:13.554492 | orchestrator | 2025-09-27 03:44:13 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:44:13.554575 | orchestrator | 2025-09-27 03:44:13 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:44:16.597808 | orchestrator | 2025-09-27 03:44:16 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:44:16.598453 | orchestrator | 2025-09-27 03:44:16 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:44:16.598610 | orchestrator | 2025-09-27 03:44:16 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:44:19.646180 | orchestrator | 2025-09-27 03:44:19 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:44:19.648336 | orchestrator | 2025-09-27 03:44:19 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:44:19.648365 | orchestrator | 2025-09-27 03:44:19 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:44:22.698134 | orchestrator | 2025-09-27 03:44:22 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:44:22.700956 | orchestrator | 2025-09-27 03:44:22 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:44:22.701408 | orchestrator | 2025-09-27 03:44:22 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:44:25.743754 | orchestrator | 2025-09-27 03:44:25 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:44:25.744467 | orchestrator | 2025-09-27 03:44:25 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:44:25.744500 | orchestrator | 2025-09-27 03:44:25 | INFO  | Wait 1 second(s) 
until the next check 2025-09-27 03:44:28.788459 | orchestrator | 2025-09-27 03:44:28 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:44:28.789969 | orchestrator | 2025-09-27 03:44:28 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:44:28.790000 | orchestrator | 2025-09-27 03:44:28 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:44:31.836710 | orchestrator | 2025-09-27 03:44:31 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:44:31.837712 | orchestrator | 2025-09-27 03:44:31 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:44:31.837965 | orchestrator | 2025-09-27 03:44:31 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:44:34.883216 | orchestrator | 2025-09-27 03:44:34 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:44:34.884526 | orchestrator | 2025-09-27 03:44:34 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:44:34.884702 | orchestrator | 2025-09-27 03:44:34 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:44:37.932498 | orchestrator | 2025-09-27 03:44:37 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:44:37.934377 | orchestrator | 2025-09-27 03:44:37 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:44:37.934676 | orchestrator | 2025-09-27 03:44:37 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:44:40.980421 | orchestrator | 2025-09-27 03:44:40 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:44:40.981718 | orchestrator | 2025-09-27 03:44:40 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:44:40.981747 | orchestrator | 2025-09-27 03:44:40 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:44:44.024529 | orchestrator | 2025-09-27 03:44:44 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:44:44.025626 | orchestrator | 2025-09-27 03:44:44 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:44:44.025989 | orchestrator | 2025-09-27 03:44:44 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:44:47.065223 | orchestrator | 2025-09-27 03:44:47 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:44:47.067514 | orchestrator | 2025-09-27 03:44:47 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:44:47.067887 | orchestrator | 2025-09-27 03:44:47 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:44:50.110127 | orchestrator | 2025-09-27 03:44:50 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:44:50.111271 | orchestrator | 2025-09-27 03:44:50 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:44:50.111481 | orchestrator | 2025-09-27 03:44:50 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:44:53.163556 | orchestrator | 2025-09-27 03:44:53 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:44:53.165366 | orchestrator | 2025-09-27 03:44:53 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:44:53.165396 | orchestrator | 2025-09-27 03:44:53 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:44:56.211993 | orchestrator | 2025-09-27 03:44:56 | INFO  | 
Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:44:56.215183 | orchestrator | 2025-09-27 03:44:56 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:44:56.215258 | orchestrator | 2025-09-27 03:44:56 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:44:59.261047 | orchestrator | 2025-09-27 03:44:59 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:44:59.263228 | orchestrator | 2025-09-27 03:44:59 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:44:59.263422 | orchestrator | 2025-09-27 03:44:59 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:45:02.317002 | orchestrator | 2025-09-27 03:45:02 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:45:02.318842 | orchestrator | 2025-09-27 03:45:02 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:45:02.318936 | orchestrator | 2025-09-27 03:45:02 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:45:05.368776 | orchestrator | 2025-09-27 03:45:05 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:45:05.370505 | orchestrator | 2025-09-27 03:45:05 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:45:05.370724 | orchestrator | 2025-09-27 03:45:05 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:45:08.418484 | orchestrator | 2025-09-27 03:45:08 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:45:08.419557 | orchestrator | 2025-09-27 03:45:08 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:45:08.419584 | orchestrator | 2025-09-27 03:45:08 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:45:11.457694 | orchestrator | 2025-09-27 03:45:11 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:45:11.459101 | orchestrator | 2025-09-27 03:45:11 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:45:11.459154 | orchestrator | 2025-09-27 03:45:11 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:45:14.502180 | orchestrator | 2025-09-27 03:45:14 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:45:14.503383 | orchestrator | 2025-09-27 03:45:14 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:45:14.503413 | orchestrator | 2025-09-27 03:45:14 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:45:17.549428 | orchestrator | 2025-09-27 03:45:17 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:45:17.552532 | orchestrator | 2025-09-27 03:45:17 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:45:17.552749 | orchestrator | 2025-09-27 03:45:17 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:45:20.599681 | orchestrator | 2025-09-27 03:45:20 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:45:20.600588 | orchestrator | 2025-09-27 03:45:20 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:45:20.601095 | orchestrator | 2025-09-27 03:45:20 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:45:23.644316 | orchestrator | 2025-09-27 03:45:23 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:45:23.645520 | 
orchestrator | 2025-09-27 03:45:23 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:45:23.645554 | orchestrator | 2025-09-27 03:45:23 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:45:26.688705 | orchestrator | 2025-09-27 03:45:26 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:45:26.690208 | orchestrator | 2025-09-27 03:45:26 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:45:26.690289 | orchestrator | 2025-09-27 03:45:26 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:45:29.737739 | orchestrator | 2025-09-27 03:45:29 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:45:29.739465 | orchestrator | 2025-09-27 03:45:29 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:45:29.739645 | orchestrator | 2025-09-27 03:45:29 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:45:32.786691 | orchestrator | 2025-09-27 03:45:32 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:45:32.788163 | orchestrator | 2025-09-27 03:45:32 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:45:32.788191 | orchestrator | 2025-09-27 03:45:32 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:45:35.829674 | orchestrator | 2025-09-27 03:45:35 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:45:35.831386 | orchestrator | 2025-09-27 03:45:35 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:45:35.831445 | orchestrator | 2025-09-27 03:45:35 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:45:38.875604 | orchestrator | 2025-09-27 03:45:38 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:45:38.876711 | orchestrator | 2025-09-27 03:45:38 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:45:38.876750 | orchestrator | 2025-09-27 03:45:38 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:45:41.920243 | orchestrator | 2025-09-27 03:45:41 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:45:41.921515 | orchestrator | 2025-09-27 03:45:41 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:45:41.921551 | orchestrator | 2025-09-27 03:45:41 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:45:44.969757 | orchestrator | 2025-09-27 03:45:44 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:45:44.972081 | orchestrator | 2025-09-27 03:45:44 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:45:44.972107 | orchestrator | 2025-09-27 03:45:44 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:45:48.020352 | orchestrator | 2025-09-27 03:45:48 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:45:48.021837 | orchestrator | 2025-09-27 03:45:48 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:45:48.022104 | orchestrator | 2025-09-27 03:45:48 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:45:51.063680 | orchestrator | 2025-09-27 03:45:51 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:45:51.065478 | orchestrator | 2025-09-27 03:45:51 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state 
STARTED 2025-09-27 03:45:51.065490 | orchestrator | 2025-09-27 03:45:51 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:45:54.113572 | orchestrator | 2025-09-27 03:45:54 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:45:54.115376 | orchestrator | 2025-09-27 03:45:54 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:45:54.115414 | orchestrator | 2025-09-27 03:45:54 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:45:57.155756 | orchestrator | 2025-09-27 03:45:57 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:45:57.157884 | orchestrator | 2025-09-27 03:45:57 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:45:57.157978 | orchestrator | 2025-09-27 03:45:57 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:46:00.205312 | orchestrator | 2025-09-27 03:46:00 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:46:00.207057 | orchestrator | 2025-09-27 03:46:00 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:46:00.207088 | orchestrator | 2025-09-27 03:46:00 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:46:03.251953 | orchestrator | 2025-09-27 03:46:03 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:46:03.253750 | orchestrator | 2025-09-27 03:46:03 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:46:03.253768 | orchestrator | 2025-09-27 03:46:03 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:46:06.301000 | orchestrator | 2025-09-27 03:46:06 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:46:06.302503 | orchestrator | 2025-09-27 03:46:06 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:46:06.302576 | orchestrator | 2025-09-27 03:46:06 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:46:09.346472 | orchestrator | 2025-09-27 03:46:09 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:46:09.347754 | orchestrator | 2025-09-27 03:46:09 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:46:09.348171 | orchestrator | 2025-09-27 03:46:09 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:46:12.393629 | orchestrator | 2025-09-27 03:46:12 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:46:12.394686 | orchestrator | 2025-09-27 03:46:12 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:46:12.394715 | orchestrator | 2025-09-27 03:46:12 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:46:15.442937 | orchestrator | 2025-09-27 03:46:15 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:46:15.443875 | orchestrator | 2025-09-27 03:46:15 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:46:15.444093 | orchestrator | 2025-09-27 03:46:15 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:46:18.482757 | orchestrator | 2025-09-27 03:46:18 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:46:18.484128 | orchestrator | 2025-09-27 03:46:18 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:46:18.484346 | orchestrator | 2025-09-27 03:46:18 | INFO  | Wait 1 second(s) 
until the next check 2025-09-27 03:46:21.529703 | orchestrator | 2025-09-27 03:46:21 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:46:21.531323 | orchestrator | 2025-09-27 03:46:21 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:46:21.531354 | orchestrator | 2025-09-27 03:46:21 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:46:24.578817 | orchestrator | 2025-09-27 03:46:24 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:46:24.581573 | orchestrator | 2025-09-27 03:46:24 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:46:24.582486 | orchestrator | 2025-09-27 03:46:24 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:46:27.631780 | orchestrator | 2025-09-27 03:46:27 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:46:27.633700 | orchestrator | 2025-09-27 03:46:27 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:46:27.633774 | orchestrator | 2025-09-27 03:46:27 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:46:30.681260 | orchestrator | 2025-09-27 03:46:30 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:46:30.682876 | orchestrator | 2025-09-27 03:46:30 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:46:30.725802 | orchestrator | 2025-09-27 03:46:30 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:46:33.728070 | orchestrator | 2025-09-27 03:46:33 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:46:33.729433 | orchestrator | 2025-09-27 03:46:33 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:46:33.729840 | orchestrator | 2025-09-27 03:46:33 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:46:36.778518 | orchestrator | 2025-09-27 03:46:36 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:46:36.780849 | orchestrator | 2025-09-27 03:46:36 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:46:36.780977 | orchestrator | 2025-09-27 03:46:36 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:46:39.829772 | orchestrator | 2025-09-27 03:46:39 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:46:39.831802 | orchestrator | 2025-09-27 03:46:39 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:46:39.831830 | orchestrator | 2025-09-27 03:46:39 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:46:42.885907 | orchestrator | 2025-09-27 03:46:42 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:46:42.887216 | orchestrator | 2025-09-27 03:46:42 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:46:42.887723 | orchestrator | 2025-09-27 03:46:42 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:46:45.930128 | orchestrator | 2025-09-27 03:46:45 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:46:45.931683 | orchestrator | 2025-09-27 03:46:45 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:46:45.932111 | orchestrator | 2025-09-27 03:46:45 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:46:48.974959 | orchestrator | 2025-09-27 03:46:48 | INFO  | 
Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:46:48.975857 | orchestrator | 2025-09-27 03:46:48 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:46:48.975885 | orchestrator | 2025-09-27 03:46:48 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:46:52.024531 | orchestrator | 2025-09-27 03:46:52 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:46:52.025694 | orchestrator | 2025-09-27 03:46:52 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:46:52.025737 | orchestrator | 2025-09-27 03:46:52 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:46:55.069830 | orchestrator | 2025-09-27 03:46:55 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:46:55.070409 | orchestrator | 2025-09-27 03:46:55 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:46:55.070440 | orchestrator | 2025-09-27 03:46:55 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:46:58.117066 | orchestrator | 2025-09-27 03:46:58 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:46:58.119298 | orchestrator | 2025-09-27 03:46:58 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:46:58.119370 | orchestrator | 2025-09-27 03:46:58 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:47:01.164782 | orchestrator | 2025-09-27 03:47:01 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:47:01.166350 | orchestrator | 2025-09-27 03:47:01 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:47:01.166402 | orchestrator | 2025-09-27 03:47:01 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:47:04.216874 | orchestrator | 2025-09-27 03:47:04 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:47:04.218092 | orchestrator | 2025-09-27 03:47:04 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:47:04.218154 | orchestrator | 2025-09-27 03:47:04 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:47:07.271444 | orchestrator | 2025-09-27 03:47:07 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:47:07.272806 | orchestrator | 2025-09-27 03:47:07 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:47:07.272839 | orchestrator | 2025-09-27 03:47:07 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:47:10.321014 | orchestrator | 2025-09-27 03:47:10 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:47:10.322621 | orchestrator | 2025-09-27 03:47:10 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:47:10.322851 | orchestrator | 2025-09-27 03:47:10 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:47:13.378313 | orchestrator | 2025-09-27 03:47:13 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:47:13.380043 | orchestrator | 2025-09-27 03:47:13 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:47:13.380290 | orchestrator | 2025-09-27 03:47:13 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:47:16.429996 | orchestrator | 2025-09-27 03:47:16 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:47:16.431696 | 
orchestrator | 2025-09-27 03:47:16 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:47:16.431826 | orchestrator | 2025-09-27 03:47:16 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:47:19.478699 | orchestrator | 2025-09-27 03:47:19 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:47:19.481277 | orchestrator | 2025-09-27 03:47:19 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:47:19.481608 | orchestrator | 2025-09-27 03:47:19 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:47:22.524668 | orchestrator | 2025-09-27 03:47:22 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:47:22.526797 | orchestrator | 2025-09-27 03:47:22 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:47:22.527006 | orchestrator | 2025-09-27 03:47:22 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:47:25.567754 | orchestrator | 2025-09-27 03:47:25 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:47:25.569154 | orchestrator | 2025-09-27 03:47:25 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:47:25.569182 | orchestrator | 2025-09-27 03:47:25 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:47:28.609046 | orchestrator | 2025-09-27 03:47:28 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:47:28.610128 | orchestrator | 2025-09-27 03:47:28 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:47:28.610155 | orchestrator | 2025-09-27 03:47:28 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:47:31.659650 | orchestrator | 2025-09-27 03:47:31 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:47:31.662448 | orchestrator | 2025-09-27 03:47:31 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:47:31.662477 | orchestrator | 2025-09-27 03:47:31 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:47:34.708275 | orchestrator | 2025-09-27 03:47:34 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:47:34.709717 | orchestrator | 2025-09-27 03:47:34 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:47:34.710110 | orchestrator | 2025-09-27 03:47:34 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:47:37.761269 | orchestrator | 2025-09-27 03:47:37 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:47:37.763344 | orchestrator | 2025-09-27 03:47:37 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:47:37.763449 | orchestrator | 2025-09-27 03:47:37 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:47:40.815895 | orchestrator | 2025-09-27 03:47:40 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:47:40.817863 | orchestrator | 2025-09-27 03:47:40 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:47:40.817891 | orchestrator | 2025-09-27 03:47:40 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:47:43.869949 | orchestrator | 2025-09-27 03:47:43 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:47:43.871518 | orchestrator | 2025-09-27 03:47:43 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state 
STARTED 2025-09-27 03:47:43.871562 | orchestrator | 2025-09-27 03:47:43 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:47:46.918781 | orchestrator | 2025-09-27 03:47:46 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:47:46.920261 | orchestrator | 2025-09-27 03:47:46 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:47:46.920289 | orchestrator | 2025-09-27 03:47:46 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:47:49.965311 | orchestrator | 2025-09-27 03:47:49 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:47:49.967393 | orchestrator | 2025-09-27 03:47:49 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:47:49.967423 | orchestrator | 2025-09-27 03:47:49 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:47:53.017447 | orchestrator | 2025-09-27 03:47:53 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:47:53.018702 | orchestrator | 2025-09-27 03:47:53 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:47:53.018857 | orchestrator | 2025-09-27 03:47:53 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:47:56.056423 | orchestrator | 2025-09-27 03:47:56 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:47:56.057887 | orchestrator | 2025-09-27 03:47:56 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:47:56.057917 | orchestrator | 2025-09-27 03:47:56 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:47:59.107417 | orchestrator | 2025-09-27 03:47:59 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:47:59.109285 | orchestrator | 2025-09-27 03:47:59 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:47:59.109318 | orchestrator | 2025-09-27 03:47:59 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:48:02.152930 | orchestrator | 2025-09-27 03:48:02 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:48:02.153493 | orchestrator | 2025-09-27 03:48:02 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:48:02.153692 | orchestrator | 2025-09-27 03:48:02 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:48:05.198853 | orchestrator | 2025-09-27 03:48:05 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:48:05.199831 | orchestrator | 2025-09-27 03:48:05 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:48:05.199859 | orchestrator | 2025-09-27 03:48:05 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:48:08.246439 | orchestrator | 2025-09-27 03:48:08 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:48:08.246543 | orchestrator | 2025-09-27 03:48:08 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:48:08.246559 | orchestrator | 2025-09-27 03:48:08 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:48:11.292844 | orchestrator | 2025-09-27 03:48:11 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:48:11.294868 | orchestrator | 2025-09-27 03:48:11 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:48:11.294895 | orchestrator | 2025-09-27 03:48:11 | INFO  | Wait 1 second(s) 
until the next check
2025-09-27 03:48:14.344854 | orchestrator | 2025-09-27 03:48:14 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED
2025-09-27 03:48:14.346563 | orchestrator | 2025-09-27 03:48:14 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED
2025-09-27 03:48:14.346783 | orchestrator | 2025-09-27 03:48:14 | INFO  | Wait 1 second(s) until the next check
[identical status checks repeated roughly every 3 seconds; both tasks remained in state STARTED from 03:48:14 through 03:57:35]
2025-09-27 03:57:35.230545 | orchestrator | 2025-09-27 03:57:35 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED
2025-09-27 03:57:35.232125 | orchestrator | 2025-09-27 03:57:35 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED
2025-09-27 03:57:35.232230 | orchestrator | 2025-09-27 03:57:35 | INFO  | Wait 1 second(s)
until the next check 2025-09-27 03:57:38.276516 | orchestrator | 2025-09-27 03:57:38 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:57:38.277377 | orchestrator | 2025-09-27 03:57:38 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:57:38.277414 | orchestrator | 2025-09-27 03:57:38 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:57:41.324118 | orchestrator | 2025-09-27 03:57:41 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:57:41.325978 | orchestrator | 2025-09-27 03:57:41 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:57:41.326439 | orchestrator | 2025-09-27 03:57:41 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:57:44.375779 | orchestrator | 2025-09-27 03:57:44 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:57:44.378258 | orchestrator | 2025-09-27 03:57:44 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:57:44.378349 | orchestrator | 2025-09-27 03:57:44 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:57:47.429257 | orchestrator | 2025-09-27 03:57:47 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:57:47.430355 | orchestrator | 2025-09-27 03:57:47 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:57:47.430786 | orchestrator | 2025-09-27 03:57:47 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:57:50.479996 | orchestrator | 2025-09-27 03:57:50 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:57:50.481873 | orchestrator | 2025-09-27 03:57:50 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:57:50.481918 | orchestrator | 2025-09-27 03:57:50 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:57:53.526553 | orchestrator | 2025-09-27 03:57:53 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:57:53.527687 | orchestrator | 2025-09-27 03:57:53 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:57:53.527751 | orchestrator | 2025-09-27 03:57:53 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:57:56.571193 | orchestrator | 2025-09-27 03:57:56 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:57:56.573789 | orchestrator | 2025-09-27 03:57:56 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:57:56.573820 | orchestrator | 2025-09-27 03:57:56 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:57:59.613931 | orchestrator | 2025-09-27 03:57:59 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:57:59.614771 | orchestrator | 2025-09-27 03:57:59 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:57:59.614796 | orchestrator | 2025-09-27 03:57:59 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:58:02.660770 | orchestrator | 2025-09-27 03:58:02 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:58:02.671733 | orchestrator | 2025-09-27 03:58:02 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:58:02.671780 | orchestrator | 2025-09-27 03:58:02 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:58:05.712668 | orchestrator | 2025-09-27 03:58:05 | INFO  | 
Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:58:05.716363 | orchestrator | 2025-09-27 03:58:05 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:58:05.716396 | orchestrator | 2025-09-27 03:58:05 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:58:08.761917 | orchestrator | 2025-09-27 03:58:08 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:58:08.762924 | orchestrator | 2025-09-27 03:58:08 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:58:08.763121 | orchestrator | 2025-09-27 03:58:08 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:58:11.816096 | orchestrator | 2025-09-27 03:58:11 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:58:11.817099 | orchestrator | 2025-09-27 03:58:11 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:58:11.817128 | orchestrator | 2025-09-27 03:58:11 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:58:14.869367 | orchestrator | 2025-09-27 03:58:14 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:58:14.870410 | orchestrator | 2025-09-27 03:58:14 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:58:14.870445 | orchestrator | 2025-09-27 03:58:14 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:58:17.914115 | orchestrator | 2025-09-27 03:58:17 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:58:17.915919 | orchestrator | 2025-09-27 03:58:17 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:58:17.916105 | orchestrator | 2025-09-27 03:58:17 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:58:20.961393 | orchestrator | 2025-09-27 03:58:20 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:58:20.963902 | orchestrator | 2025-09-27 03:58:20 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:58:20.963992 | orchestrator | 2025-09-27 03:58:20 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:58:24.015202 | orchestrator | 2025-09-27 03:58:24 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:58:24.016438 | orchestrator | 2025-09-27 03:58:24 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:58:24.016658 | orchestrator | 2025-09-27 03:58:24 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:58:27.064007 | orchestrator | 2025-09-27 03:58:27 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:58:27.065567 | orchestrator | 2025-09-27 03:58:27 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:58:27.066139 | orchestrator | 2025-09-27 03:58:27 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:58:30.113851 | orchestrator | 2025-09-27 03:58:30 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:58:30.115451 | orchestrator | 2025-09-27 03:58:30 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:58:30.115496 | orchestrator | 2025-09-27 03:58:30 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:58:33.164837 | orchestrator | 2025-09-27 03:58:33 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:58:33.165559 | 
orchestrator | 2025-09-27 03:58:33 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:58:33.166754 | orchestrator | 2025-09-27 03:58:33 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:58:36.217228 | orchestrator | 2025-09-27 03:58:36 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:58:36.220818 | orchestrator | 2025-09-27 03:58:36 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:58:36.221105 | orchestrator | 2025-09-27 03:58:36 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:58:39.268066 | orchestrator | 2025-09-27 03:58:39 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:58:39.269866 | orchestrator | 2025-09-27 03:58:39 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:58:39.269903 | orchestrator | 2025-09-27 03:58:39 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:58:42.318411 | orchestrator | 2025-09-27 03:58:42 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:58:42.320225 | orchestrator | 2025-09-27 03:58:42 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:58:42.320254 | orchestrator | 2025-09-27 03:58:42 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:58:45.364527 | orchestrator | 2025-09-27 03:58:45 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:58:45.367067 | orchestrator | 2025-09-27 03:58:45 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:58:45.367158 | orchestrator | 2025-09-27 03:58:45 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:58:48.412073 | orchestrator | 2025-09-27 03:58:48 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:58:48.412895 | orchestrator | 2025-09-27 03:58:48 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:58:48.412986 | orchestrator | 2025-09-27 03:58:48 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:58:51.457380 | orchestrator | 2025-09-27 03:58:51 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:58:51.458578 | orchestrator | 2025-09-27 03:58:51 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:58:51.458745 | orchestrator | 2025-09-27 03:58:51 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:58:54.500541 | orchestrator | 2025-09-27 03:58:54 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:58:54.501422 | orchestrator | 2025-09-27 03:58:54 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:58:54.501453 | orchestrator | 2025-09-27 03:58:54 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:58:57.544890 | orchestrator | 2025-09-27 03:58:57 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:58:57.546501 | orchestrator | 2025-09-27 03:58:57 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:58:57.546596 | orchestrator | 2025-09-27 03:58:57 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:59:00.589592 | orchestrator | 2025-09-27 03:59:00 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:59:00.592856 | orchestrator | 2025-09-27 03:59:00 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state 
STARTED 2025-09-27 03:59:00.592888 | orchestrator | 2025-09-27 03:59:00 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:59:03.643873 | orchestrator | 2025-09-27 03:59:03 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:59:03.647459 | orchestrator | 2025-09-27 03:59:03 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:59:03.647511 | orchestrator | 2025-09-27 03:59:03 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:59:06.696166 | orchestrator | 2025-09-27 03:59:06 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:59:06.697472 | orchestrator | 2025-09-27 03:59:06 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:59:06.697706 | orchestrator | 2025-09-27 03:59:06 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:59:09.742813 | orchestrator | 2025-09-27 03:59:09 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:59:09.745383 | orchestrator | 2025-09-27 03:59:09 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:59:09.745414 | orchestrator | 2025-09-27 03:59:09 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:59:12.794994 | orchestrator | 2025-09-27 03:59:12 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:59:12.797430 | orchestrator | 2025-09-27 03:59:12 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:59:12.797841 | orchestrator | 2025-09-27 03:59:12 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:59:15.845062 | orchestrator | 2025-09-27 03:59:15 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:59:15.846192 | orchestrator | 2025-09-27 03:59:15 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:59:15.846325 | orchestrator | 2025-09-27 03:59:15 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:59:18.890313 | orchestrator | 2025-09-27 03:59:18 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:59:18.892144 | orchestrator | 2025-09-27 03:59:18 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:59:18.892173 | orchestrator | 2025-09-27 03:59:18 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:59:21.937872 | orchestrator | 2025-09-27 03:59:21 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:59:21.939152 | orchestrator | 2025-09-27 03:59:21 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:59:21.939182 | orchestrator | 2025-09-27 03:59:21 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:59:24.982464 | orchestrator | 2025-09-27 03:59:24 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:59:24.983849 | orchestrator | 2025-09-27 03:59:24 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:59:24.983894 | orchestrator | 2025-09-27 03:59:24 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:59:28.032679 | orchestrator | 2025-09-27 03:59:28 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:59:28.033911 | orchestrator | 2025-09-27 03:59:28 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:59:28.033955 | orchestrator | 2025-09-27 03:59:28 | INFO  | Wait 1 second(s) 
until the next check 2025-09-27 03:59:31.078492 | orchestrator | 2025-09-27 03:59:31 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:59:31.079674 | orchestrator | 2025-09-27 03:59:31 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:59:31.079760 | orchestrator | 2025-09-27 03:59:31 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:59:34.123444 | orchestrator | 2025-09-27 03:59:34 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:59:34.125772 | orchestrator | 2025-09-27 03:59:34 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:59:34.125831 | orchestrator | 2025-09-27 03:59:34 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:59:37.168825 | orchestrator | 2025-09-27 03:59:37 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:59:37.170200 | orchestrator | 2025-09-27 03:59:37 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:59:37.170231 | orchestrator | 2025-09-27 03:59:37 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:59:40.211386 | orchestrator | 2025-09-27 03:59:40 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:59:40.213012 | orchestrator | 2025-09-27 03:59:40 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:59:40.213039 | orchestrator | 2025-09-27 03:59:40 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:59:43.260367 | orchestrator | 2025-09-27 03:59:43 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:59:43.261490 | orchestrator | 2025-09-27 03:59:43 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:59:43.261574 | orchestrator | 2025-09-27 03:59:43 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:59:46.311681 | orchestrator | 2025-09-27 03:59:46 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:59:46.313444 | orchestrator | 2025-09-27 03:59:46 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:59:46.313714 | orchestrator | 2025-09-27 03:59:46 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:59:49.358627 | orchestrator | 2025-09-27 03:59:49 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:59:49.359965 | orchestrator | 2025-09-27 03:59:49 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:59:49.359993 | orchestrator | 2025-09-27 03:59:49 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:59:52.410447 | orchestrator | 2025-09-27 03:59:52 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:59:52.411709 | orchestrator | 2025-09-27 03:59:52 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:59:52.411845 | orchestrator | 2025-09-27 03:59:52 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:59:55.457397 | orchestrator | 2025-09-27 03:59:55 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:59:55.458788 | orchestrator | 2025-09-27 03:59:55 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:59:55.458830 | orchestrator | 2025-09-27 03:59:55 | INFO  | Wait 1 second(s) until the next check 2025-09-27 03:59:58.511494 | orchestrator | 2025-09-27 03:59:58 | INFO  | 
Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 03:59:58.512903 | orchestrator | 2025-09-27 03:59:58 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 03:59:58.512933 | orchestrator | 2025-09-27 03:59:58 | INFO  | Wait 1 second(s) until the next check 2025-09-27 04:00:01.554377 | orchestrator | 2025-09-27 04:00:01 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 04:00:01.557221 | orchestrator | 2025-09-27 04:00:01 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 04:00:01.557530 | orchestrator | 2025-09-27 04:00:01 | INFO  | Wait 1 second(s) until the next check 2025-09-27 04:00:04.604020 | orchestrator | 2025-09-27 04:00:04 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 04:00:04.605763 | orchestrator | 2025-09-27 04:00:04 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 04:00:04.605923 | orchestrator | 2025-09-27 04:00:04 | INFO  | Wait 1 second(s) until the next check 2025-09-27 04:00:07.655115 | orchestrator | 2025-09-27 04:00:07 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 04:00:07.656127 | orchestrator | 2025-09-27 04:00:07 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 04:00:07.656285 | orchestrator | 2025-09-27 04:00:07 | INFO  | Wait 1 second(s) until the next check 2025-09-27 04:00:10.706271 | orchestrator | 2025-09-27 04:00:10 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 04:00:10.708685 | orchestrator | 2025-09-27 04:00:10 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 04:00:10.708941 | orchestrator | 2025-09-27 04:00:10 | INFO  | Wait 1 second(s) until the next check 2025-09-27 04:00:13.754353 | orchestrator | 2025-09-27 04:00:13 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 04:00:13.756437 | orchestrator | 2025-09-27 04:00:13 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 04:00:13.756513 | orchestrator | 2025-09-27 04:00:13 | INFO  | Wait 1 second(s) until the next check 2025-09-27 04:00:16.804947 | orchestrator | 2025-09-27 04:00:16 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 04:00:16.806554 | orchestrator | 2025-09-27 04:00:16 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 04:00:16.806666 | orchestrator | 2025-09-27 04:00:16 | INFO  | Wait 1 second(s) until the next check 2025-09-27 04:00:19.853282 | orchestrator | 2025-09-27 04:00:19 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 04:00:19.855366 | orchestrator | 2025-09-27 04:00:19 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 04:00:19.855579 | orchestrator | 2025-09-27 04:00:19 | INFO  | Wait 1 second(s) until the next check 2025-09-27 04:00:22.901057 | orchestrator | 2025-09-27 04:00:22 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 04:00:22.902561 | orchestrator | 2025-09-27 04:00:22 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 04:00:22.902591 | orchestrator | 2025-09-27 04:00:22 | INFO  | Wait 1 second(s) until the next check 2025-09-27 04:00:25.948457 | orchestrator | 2025-09-27 04:00:25 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 04:00:25.951040 | 
orchestrator | 2025-09-27 04:00:25 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 04:00:25.951073 | orchestrator | 2025-09-27 04:00:25 | INFO  | Wait 1 second(s) until the next check 2025-09-27 04:00:28.990308 | orchestrator | 2025-09-27 04:00:28 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 04:00:28.991773 | orchestrator | 2025-09-27 04:00:28 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 04:00:28.991852 | orchestrator | 2025-09-27 04:00:28 | INFO  | Wait 1 second(s) until the next check 2025-09-27 04:00:32.033268 | orchestrator | 2025-09-27 04:00:32 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 04:00:32.034793 | orchestrator | 2025-09-27 04:00:32 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 04:00:32.034836 | orchestrator | 2025-09-27 04:00:32 | INFO  | Wait 1 second(s) until the next check 2025-09-27 04:00:35.074344 | orchestrator | 2025-09-27 04:00:35 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 04:00:35.076258 | orchestrator | 2025-09-27 04:00:35 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 04:00:35.076284 | orchestrator | 2025-09-27 04:00:35 | INFO  | Wait 1 second(s) until the next check 2025-09-27 04:00:38.127633 | orchestrator | 2025-09-27 04:00:38 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 04:00:38.129843 | orchestrator | 2025-09-27 04:00:38 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 04:00:38.130187 | orchestrator | 2025-09-27 04:00:38 | INFO  | Wait 1 second(s) until the next check 2025-09-27 04:00:41.176773 | orchestrator | 2025-09-27 04:00:41 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 04:00:41.179460 | orchestrator | 2025-09-27 04:00:41 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 04:00:41.179554 | orchestrator | 2025-09-27 04:00:41 | INFO  | Wait 1 second(s) until the next check 2025-09-27 04:00:44.228560 | orchestrator | 2025-09-27 04:00:44 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 04:00:44.230628 | orchestrator | 2025-09-27 04:00:44 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 04:00:44.230692 | orchestrator | 2025-09-27 04:00:44 | INFO  | Wait 1 second(s) until the next check 2025-09-27 04:00:47.279308 | orchestrator | 2025-09-27 04:00:47 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 04:00:47.281225 | orchestrator | 2025-09-27 04:00:47 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 04:00:47.281427 | orchestrator | 2025-09-27 04:00:47 | INFO  | Wait 1 second(s) until the next check 2025-09-27 04:00:50.325819 | orchestrator | 2025-09-27 04:00:50 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 04:00:50.327985 | orchestrator | 2025-09-27 04:00:50 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 04:00:50.328053 | orchestrator | 2025-09-27 04:00:50 | INFO  | Wait 1 second(s) until the next check 2025-09-27 04:00:53.373872 | orchestrator | 2025-09-27 04:00:53 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 04:00:53.375103 | orchestrator | 2025-09-27 04:00:53 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state 
STARTED 2025-09-27 04:00:53.375378 | orchestrator | 2025-09-27 04:00:53 | INFO  | Wait 1 second(s) until the next check 2025-09-27 04:00:56.419092 | orchestrator | 2025-09-27 04:00:56 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 04:00:56.420377 | orchestrator | 2025-09-27 04:00:56 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 04:00:56.420635 | orchestrator | 2025-09-27 04:00:56 | INFO  | Wait 1 second(s) until the next check 2025-09-27 04:00:59.467388 | orchestrator | 2025-09-27 04:00:59 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 04:00:59.469316 | orchestrator | 2025-09-27 04:00:59 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 04:00:59.469418 | orchestrator | 2025-09-27 04:00:59 | INFO  | Wait 1 second(s) until the next check 2025-09-27 04:01:02.518804 | orchestrator | 2025-09-27 04:01:02 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 04:01:02.519929 | orchestrator | 2025-09-27 04:01:02 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 04:01:02.520013 | orchestrator | 2025-09-27 04:01:02 | INFO  | Wait 1 second(s) until the next check 2025-09-27 04:01:05.564625 | orchestrator | 2025-09-27 04:01:05 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 04:01:05.565820 | orchestrator | 2025-09-27 04:01:05 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 04:01:05.565850 | orchestrator | 2025-09-27 04:01:05 | INFO  | Wait 1 second(s) until the next check 2025-09-27 04:01:08.608792 | orchestrator | 2025-09-27 04:01:08 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 04:01:08.609465 | orchestrator | 2025-09-27 04:01:08 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 04:01:08.609493 | orchestrator | 2025-09-27 04:01:08 | INFO  | Wait 1 second(s) until the next check 2025-09-27 04:01:11.653278 | orchestrator | 2025-09-27 04:01:11 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 04:01:11.654919 | orchestrator | 2025-09-27 04:01:11 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 04:01:11.654958 | orchestrator | 2025-09-27 04:01:11 | INFO  | Wait 1 second(s) until the next check 2025-09-27 04:01:14.701616 | orchestrator | 2025-09-27 04:01:14 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 04:01:14.702586 | orchestrator | 2025-09-27 04:01:14 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 04:01:14.702620 | orchestrator | 2025-09-27 04:01:14 | INFO  | Wait 1 second(s) until the next check 2025-09-27 04:01:17.746203 | orchestrator | 2025-09-27 04:01:17 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 04:01:17.747153 | orchestrator | 2025-09-27 04:01:17 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 04:01:17.747243 | orchestrator | 2025-09-27 04:01:17 | INFO  | Wait 1 second(s) until the next check 2025-09-27 04:01:20.795693 | orchestrator | 2025-09-27 04:01:20 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 04:01:20.797014 | orchestrator | 2025-09-27 04:01:20 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 04:01:20.797128 | orchestrator | 2025-09-27 04:01:20 | INFO  | Wait 1 second(s) 
until the next check 2025-09-27 04:01:23.846264 | orchestrator | 2025-09-27 04:01:23 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 04:01:23.847260 | orchestrator | 2025-09-27 04:01:23 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 04:01:23.847496 | orchestrator | 2025-09-27 04:01:23 | INFO  | Wait 1 second(s) until the next check 2025-09-27 04:01:26.890282 | orchestrator | 2025-09-27 04:01:26 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 04:01:26.890862 | orchestrator | 2025-09-27 04:01:26 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 04:01:26.890891 | orchestrator | 2025-09-27 04:01:26 | INFO  | Wait 1 second(s) until the next check 2025-09-27 04:01:29.933638 | orchestrator | 2025-09-27 04:01:29 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 04:01:29.936354 | orchestrator | 2025-09-27 04:01:29 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 04:01:29.936838 | orchestrator | 2025-09-27 04:01:29 | INFO  | Wait 1 second(s) until the next check 2025-09-27 04:01:32.984910 | orchestrator | 2025-09-27 04:01:32 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 04:01:32.986352 | orchestrator | 2025-09-27 04:01:32 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 04:01:32.986386 | orchestrator | 2025-09-27 04:01:32 | INFO  | Wait 1 second(s) until the next check 2025-09-27 04:01:36.037860 | orchestrator | 2025-09-27 04:01:36 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 04:01:36.039205 | orchestrator | 2025-09-27 04:01:36 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 04:01:36.039237 | orchestrator | 2025-09-27 04:01:36 | INFO  | Wait 1 second(s) until the next check 2025-09-27 04:01:39.084839 | orchestrator | 2025-09-27 04:01:39 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 04:01:39.085802 | orchestrator | 2025-09-27 04:01:39 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 04:01:39.085954 | orchestrator | 2025-09-27 04:01:39 | INFO  | Wait 1 second(s) until the next check 2025-09-27 04:01:42.137058 | orchestrator | 2025-09-27 04:01:42 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 04:01:42.138305 | orchestrator | 2025-09-27 04:01:42 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 04:01:42.138339 | orchestrator | 2025-09-27 04:01:42 | INFO  | Wait 1 second(s) until the next check 2025-09-27 04:01:45.178615 | orchestrator | 2025-09-27 04:01:45 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 04:01:45.181427 | orchestrator | 2025-09-27 04:01:45 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 04:01:45.181459 | orchestrator | 2025-09-27 04:01:45 | INFO  | Wait 1 second(s) until the next check 2025-09-27 04:01:48.234726 | orchestrator | 2025-09-27 04:01:48 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 04:01:48.236908 | orchestrator | 2025-09-27 04:01:48 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 04:01:48.236924 | orchestrator | 2025-09-27 04:01:48 | INFO  | Wait 1 second(s) until the next check 2025-09-27 04:01:51.283484 | orchestrator | 2025-09-27 04:01:51 | INFO  | 
Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 04:01:51.285643 | orchestrator | 2025-09-27 04:01:51 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 04:01:51.285980 | orchestrator | 2025-09-27 04:01:51 | INFO  | Wait 1 second(s) until the next check 2025-09-27 04:01:54.326426 | orchestrator | 2025-09-27 04:01:54 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 04:01:54.328041 | orchestrator | 2025-09-27 04:01:54 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 04:01:54.328128 | orchestrator | 2025-09-27 04:01:54 | INFO  | Wait 1 second(s) until the next check 2025-09-27 04:01:57.375606 | orchestrator | 2025-09-27 04:01:57 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 04:01:57.377262 | orchestrator | 2025-09-27 04:01:57 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 04:01:57.377345 | orchestrator | 2025-09-27 04:01:57 | INFO  | Wait 1 second(s) until the next check 2025-09-27 04:02:00.422993 | orchestrator | 2025-09-27 04:02:00 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 04:02:00.423854 | orchestrator | 2025-09-27 04:02:00 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 04:02:00.423948 | orchestrator | 2025-09-27 04:02:00 | INFO  | Wait 1 second(s) until the next check 2025-09-27 04:02:03.471566 | orchestrator | 2025-09-27 04:02:03 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 04:02:03.473708 | orchestrator | 2025-09-27 04:02:03 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 04:02:03.473795 | orchestrator | 2025-09-27 04:02:03 | INFO  | Wait 1 second(s) until the next check 2025-09-27 04:02:06.516445 | orchestrator | 2025-09-27 04:02:06 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 04:02:06.518009 | orchestrator | 2025-09-27 04:02:06 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 04:02:06.518107 | orchestrator | 2025-09-27 04:02:06 | INFO  | Wait 1 second(s) until the next check 2025-09-27 04:02:09.565135 | orchestrator | 2025-09-27 04:02:09 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 04:02:09.567069 | orchestrator | 2025-09-27 04:02:09 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 04:02:09.567101 | orchestrator | 2025-09-27 04:02:09 | INFO  | Wait 1 second(s) until the next check 2025-09-27 04:02:12.618639 | orchestrator | 2025-09-27 04:02:12 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 04:02:12.618914 | orchestrator | 2025-09-27 04:02:12 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 04:02:12.619390 | orchestrator | 2025-09-27 04:02:12 | INFO  | Wait 1 second(s) until the next check 2025-09-27 04:02:15.667035 | orchestrator | 2025-09-27 04:02:15 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 04:02:15.670302 | orchestrator | 2025-09-27 04:02:15 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 04:02:15.670895 | orchestrator | 2025-09-27 04:02:15 | INFO  | Wait 1 second(s) until the next check 2025-09-27 04:02:18.727577 | orchestrator | 2025-09-27 04:02:18 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 04:02:18.729880 | 
orchestrator | 2025-09-27 04:02:18 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 04:02:18.730221 | orchestrator | 2025-09-27 04:02:18 | INFO  | Wait 1 second(s) until the next check 2025-09-27 04:02:21.777182 | orchestrator | 2025-09-27 04:02:21 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 04:02:21.778543 | orchestrator | 2025-09-27 04:02:21 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 04:02:21.778865 | orchestrator | 2025-09-27 04:02:21 | INFO  | Wait 1 second(s) until the next check 2025-09-27 04:02:24.829834 | orchestrator | 2025-09-27 04:02:24 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 04:02:24.833546 | orchestrator | 2025-09-27 04:02:24 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 04:02:24.833581 | orchestrator | 2025-09-27 04:02:24 | INFO  | Wait 1 second(s) until the next check 2025-09-27 04:02:27.886738 | orchestrator | 2025-09-27 04:02:27 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 04:02:27.889457 | orchestrator | 2025-09-27 04:02:27 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 04:02:27.889700 | orchestrator | 2025-09-27 04:02:27 | INFO  | Wait 1 second(s) until the next check 2025-09-27 04:02:30.938286 | orchestrator | 2025-09-27 04:02:30 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 04:02:30.938837 | orchestrator | 2025-09-27 04:02:30 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 04:02:30.938936 | orchestrator | 2025-09-27 04:02:30 | INFO  | Wait 1 second(s) until the next check 2025-09-27 04:02:33.989525 | orchestrator | 2025-09-27 04:02:33 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 04:02:33.990969 | orchestrator | 2025-09-27 04:02:33 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 04:02:33.991004 | orchestrator | 2025-09-27 04:02:33 | INFO  | Wait 1 second(s) until the next check 2025-09-27 04:02:37.036215 | orchestrator | 2025-09-27 04:02:37 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 04:02:37.037482 | orchestrator | 2025-09-27 04:02:37 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 04:02:37.037563 | orchestrator | 2025-09-27 04:02:37 | INFO  | Wait 1 second(s) until the next check 2025-09-27 04:02:40.083486 | orchestrator | 2025-09-27 04:02:40 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 04:02:40.085656 | orchestrator | 2025-09-27 04:02:40 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 04:02:40.085870 | orchestrator | 2025-09-27 04:02:40 | INFO  | Wait 1 second(s) until the next check 2025-09-27 04:02:43.125792 | orchestrator | 2025-09-27 04:02:43 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 04:02:43.127347 | orchestrator | 2025-09-27 04:02:43 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 04:02:43.127441 | orchestrator | 2025-09-27 04:02:43 | INFO  | Wait 1 second(s) until the next check 2025-09-27 04:02:46.170533 | orchestrator | 2025-09-27 04:02:46 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 04:02:46.170893 | orchestrator | 2025-09-27 04:02:46 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state 
STARTED 2025-09-27 04:02:46.170922 | orchestrator | 2025-09-27 04:02:46 | INFO  | Wait 1 second(s) until the next check 2025-09-27 04:02:49.221330 | orchestrator | 2025-09-27 04:02:49 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 04:02:49.223869 | orchestrator | 2025-09-27 04:02:49 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 04:02:49.224252 | orchestrator | 2025-09-27 04:02:49 | INFO  | Wait 1 second(s) until the next check 2025-09-27 04:02:52.273755 | orchestrator | 2025-09-27 04:02:52 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 04:02:52.275441 | orchestrator | 2025-09-27 04:02:52 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 04:02:52.275492 | orchestrator | 2025-09-27 04:02:52 | INFO  | Wait 1 second(s) until the next check 2025-09-27 04:02:55.316370 | orchestrator | 2025-09-27 04:02:55 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 04:02:55.317319 | orchestrator | 2025-09-27 04:02:55 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 04:02:55.317355 | orchestrator | 2025-09-27 04:02:55 | INFO  | Wait 1 second(s) until the next check 2025-09-27 04:02:58.366390 | orchestrator | 2025-09-27 04:02:58 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 04:02:58.367260 | orchestrator | 2025-09-27 04:02:58 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 04:02:58.367297 | orchestrator | 2025-09-27 04:02:58 | INFO  | Wait 1 second(s) until the next check 2025-09-27 04:03:01.409321 | orchestrator | 2025-09-27 04:03:01 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 04:03:01.411304 | orchestrator | 2025-09-27 04:03:01 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 04:03:01.411342 | orchestrator | 2025-09-27 04:03:01 | INFO  | Wait 1 second(s) until the next check 2025-09-27 04:03:04.456261 | orchestrator | 2025-09-27 04:03:04 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 04:03:04.459032 | orchestrator | 2025-09-27 04:03:04 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 04:03:04.459069 | orchestrator | 2025-09-27 04:03:04 | INFO  | Wait 1 second(s) until the next check 2025-09-27 04:03:07.509741 | orchestrator | 2025-09-27 04:03:07 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 04:03:07.510894 | orchestrator | 2025-09-27 04:03:07 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 04:03:07.511121 | orchestrator | 2025-09-27 04:03:07 | INFO  | Wait 1 second(s) until the next check 2025-09-27 04:03:10.560755 | orchestrator | 2025-09-27 04:03:10 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 04:03:10.564101 | orchestrator | 2025-09-27 04:03:10 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 04:03:10.564136 | orchestrator | 2025-09-27 04:03:10 | INFO  | Wait 1 second(s) until the next check 2025-09-27 04:03:13.613117 | orchestrator | 2025-09-27 04:03:13 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 04:03:13.617322 | orchestrator | 2025-09-27 04:03:13 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 04:03:13.617370 | orchestrator | 2025-09-27 04:03:13 | INFO  | Wait 1 second(s) 
until the next check 2025-09-27 04:03:16.668118 | orchestrator | 2025-09-27 04:03:16 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 04:03:16.670272 | orchestrator | 2025-09-27 04:03:16 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 04:03:16.670305 | orchestrator | 2025-09-27 04:03:16 | INFO  | Wait 1 second(s) until the next check 2025-09-27 04:03:19.710288 | orchestrator | 2025-09-27 04:03:19 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 04:03:19.711454 | orchestrator | 2025-09-27 04:03:19 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 04:03:19.711535 | orchestrator | 2025-09-27 04:03:19 | INFO  | Wait 1 second(s) until the next check 2025-09-27 04:03:22.762245 | orchestrator | 2025-09-27 04:03:22 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 04:03:22.764781 | orchestrator | 2025-09-27 04:03:22 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 04:03:22.764866 | orchestrator | 2025-09-27 04:03:22 | INFO  | Wait 1 second(s) until the next check 2025-09-27 04:03:25.808236 | orchestrator | 2025-09-27 04:03:25 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 04:03:25.809427 | orchestrator | 2025-09-27 04:03:25 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 04:03:25.809470 | orchestrator | 2025-09-27 04:03:25 | INFO  | Wait 1 second(s) until the next check 2025-09-27 04:03:28.862455 | orchestrator | 2025-09-27 04:03:28 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 04:03:28.863556 | orchestrator | 2025-09-27 04:03:28 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 04:03:28.863796 | orchestrator | 2025-09-27 04:03:28 | INFO  | Wait 1 second(s) until the next check 2025-09-27 04:03:31.911957 | orchestrator | 2025-09-27 04:03:31 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 04:03:31.913614 | orchestrator | 2025-09-27 04:03:31 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 04:03:31.914213 | orchestrator | 2025-09-27 04:03:31 | INFO  | Wait 1 second(s) until the next check 2025-09-27 04:03:34.965369 | orchestrator | 2025-09-27 04:03:34 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 04:03:34.967578 | orchestrator | 2025-09-27 04:03:34 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 04:03:34.967608 | orchestrator | 2025-09-27 04:03:34 | INFO  | Wait 1 second(s) until the next check 2025-09-27 04:03:38.022290 | orchestrator | 2025-09-27 04:03:38 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 04:03:38.023629 | orchestrator | 2025-09-27 04:03:38 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 04:03:38.023787 | orchestrator | 2025-09-27 04:03:38 | INFO  | Wait 1 second(s) until the next check 2025-09-27 04:03:41.078423 | orchestrator | 2025-09-27 04:03:41 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 04:03:41.079943 | orchestrator | 2025-09-27 04:03:41 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 04:03:41.080054 | orchestrator | 2025-09-27 04:03:41 | INFO  | Wait 1 second(s) until the next check 2025-09-27 04:03:44.128803 | orchestrator | 2025-09-27 04:03:44 | INFO  | 
Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 04:03:44.130354 | orchestrator | 2025-09-27 04:03:44 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 04:03:44.130380 | orchestrator | 2025-09-27 04:03:44 | INFO  | Wait 1 second(s) until the next check 2025-09-27 04:03:47.187934 | orchestrator | 2025-09-27 04:03:47 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 04:03:47.188008 | orchestrator | 2025-09-27 04:03:47 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 04:03:47.188133 | orchestrator | 2025-09-27 04:03:47 | INFO  | Wait 1 second(s) until the next check 2025-09-27 04:03:50.239819 | orchestrator | 2025-09-27 04:03:50 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 04:03:50.241412 | orchestrator | 2025-09-27 04:03:50 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 04:03:50.241450 | orchestrator | 2025-09-27 04:03:50 | INFO  | Wait 1 second(s) until the next check 2025-09-27 04:03:53.293186 | orchestrator | 2025-09-27 04:03:53 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 04:03:53.294248 | orchestrator | 2025-09-27 04:03:53 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 04:03:53.294285 | orchestrator | 2025-09-27 04:03:53 | INFO  | Wait 1 second(s) until the next check 2025-09-27 04:03:56.330357 | orchestrator | 2025-09-27 04:03:56 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 04:03:56.331658 | orchestrator | 2025-09-27 04:03:56 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 04:03:56.331735 | orchestrator | 2025-09-27 04:03:56 | INFO  | Wait 1 second(s) until the next check 2025-09-27 04:03:59.378167 | orchestrator | 2025-09-27 04:03:59 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 04:03:59.379011 | orchestrator | 2025-09-27 04:03:59 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 04:03:59.379048 | orchestrator | 2025-09-27 04:03:59 | INFO  | Wait 1 second(s) until the next check 2025-09-27 04:04:02.424998 | orchestrator | 2025-09-27 04:04:02 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 04:04:02.426696 | orchestrator | 2025-09-27 04:04:02 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 04:04:02.426738 | orchestrator | 2025-09-27 04:04:02 | INFO  | Wait 1 second(s) until the next check 2025-09-27 04:04:05.470824 | orchestrator | 2025-09-27 04:04:05 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 04:04:05.472383 | orchestrator | 2025-09-27 04:04:05 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 04:04:05.472431 | orchestrator | 2025-09-27 04:04:05 | INFO  | Wait 1 second(s) until the next check 2025-09-27 04:04:08.531947 | orchestrator | 2025-09-27 04:04:08 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 04:04:08.533235 | orchestrator | 2025-09-27 04:04:08 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 04:04:08.533271 | orchestrator | 2025-09-27 04:04:08 | INFO  | Wait 1 second(s) until the next check 2025-09-27 04:04:11.581511 | orchestrator | 2025-09-27 04:04:11 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 04:04:11.583364 | 
orchestrator | 2025-09-27 04:04:11 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 04:04:11.583547 | orchestrator | 2025-09-27 04:04:11 | INFO  | Wait 1 second(s) until the next check 2025-09-27 04:04:14.641431 | orchestrator | 2025-09-27 04:04:14 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 04:04:14.642601 | orchestrator | 2025-09-27 04:04:14 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 04:04:14.643208 | orchestrator | 2025-09-27 04:04:14 | INFO  | Wait 1 second(s) until the next check 2025-09-27 04:04:17.698185 | orchestrator | 2025-09-27 04:04:17 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 04:04:17.698864 | orchestrator | 2025-09-27 04:04:17 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 04:04:17.699201 | orchestrator | 2025-09-27 04:04:17 | INFO  | Wait 1 second(s) until the next check 2025-09-27 04:04:20.752805 | orchestrator | 2025-09-27 04:04:20 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 04:04:20.753954 | orchestrator | 2025-09-27 04:04:20 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 04:04:20.753994 | orchestrator | 2025-09-27 04:04:20 | INFO  | Wait 1 second(s) until the next check 2025-09-27 04:04:23.803143 | orchestrator | 2025-09-27 04:04:23 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 04:04:23.804150 | orchestrator | 2025-09-27 04:04:23 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 04:04:23.804246 | orchestrator | 2025-09-27 04:04:23 | INFO  | Wait 1 second(s) until the next check 2025-09-27 04:04:26.847869 | orchestrator | 2025-09-27 04:04:26 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 04:04:26.851329 | orchestrator | 2025-09-27 04:04:26 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 04:04:26.851382 | orchestrator | 2025-09-27 04:04:26 | INFO  | Wait 1 second(s) until the next check 2025-09-27 04:04:29.891999 | orchestrator | 2025-09-27 04:04:29 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 04:04:29.893431 | orchestrator | 2025-09-27 04:04:29 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 04:04:29.893642 | orchestrator | 2025-09-27 04:04:29 | INFO  | Wait 1 second(s) until the next check 2025-09-27 04:04:32.937826 | orchestrator | 2025-09-27 04:04:32 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 04:04:32.940116 | orchestrator | 2025-09-27 04:04:32 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 04:04:32.940147 | orchestrator | 2025-09-27 04:04:32 | INFO  | Wait 1 second(s) until the next check 2025-09-27 04:04:35.985228 | orchestrator | 2025-09-27 04:04:35 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 04:04:35.987015 | orchestrator | 2025-09-27 04:04:35 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 04:04:35.987093 | orchestrator | 2025-09-27 04:04:35 | INFO  | Wait 1 second(s) until the next check 2025-09-27 04:04:39.035078 | orchestrator | 2025-09-27 04:04:39 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 04:04:39.036000 | orchestrator | 2025-09-27 04:04:39 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state 
STARTED
2025-09-27 04:04:39.036036 | orchestrator | 2025-09-27 04:04:39 | INFO  | Wait 1 second(s) until the next check
2025-09-27 04:04:42.085197 | orchestrator | 2025-09-27 04:04:42 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED
2025-09-27 04:04:42.088077 | orchestrator | 2025-09-27 04:04:42 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED
2025-09-27 04:04:42.088166 | orchestrator | 2025-09-27 04:04:42 | INFO  | Wait 1 second(s) until the next check
[... the same status check repeats roughly every 3 seconds from 2025-09-27 04:04:45 through 2025-09-27 04:13:59; both tasks remain in state STARTED throughout ...]
2025-09-27 04:14:02.995534 | orchestrator | 2025-09-27 04:14:02 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED
2025-09-27 04:14:02.996550 | orchestrator | 2025-09-27 04:14:02 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state
STARTED 2025-09-27 04:14:02.996584 | orchestrator | 2025-09-27 04:14:02 | INFO  | Wait 1 second(s) until the next check 2025-09-27 04:14:06.055295 | orchestrator | 2025-09-27 04:14:06 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 04:14:06.057139 | orchestrator | 2025-09-27 04:14:06 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 04:14:06.057170 | orchestrator | 2025-09-27 04:14:06 | INFO  | Wait 1 second(s) until the next check 2025-09-27 04:14:09.101150 | orchestrator | 2025-09-27 04:14:09 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 04:14:09.103237 | orchestrator | 2025-09-27 04:14:09 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 04:14:09.103318 | orchestrator | 2025-09-27 04:14:09 | INFO  | Wait 1 second(s) until the next check 2025-09-27 04:14:12.153461 | orchestrator | 2025-09-27 04:14:12 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 04:14:12.155364 | orchestrator | 2025-09-27 04:14:12 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 04:14:12.155442 | orchestrator | 2025-09-27 04:14:12 | INFO  | Wait 1 second(s) until the next check 2025-09-27 04:14:15.200161 | orchestrator | 2025-09-27 04:14:15 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 04:14:15.201602 | orchestrator | 2025-09-27 04:14:15 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 04:14:15.201639 | orchestrator | 2025-09-27 04:14:15 | INFO  | Wait 1 second(s) until the next check 2025-09-27 04:14:18.248495 | orchestrator | 2025-09-27 04:14:18 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 04:14:18.250415 | orchestrator | 2025-09-27 04:14:18 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 04:14:18.250450 | orchestrator | 2025-09-27 04:14:18 | INFO  | Wait 1 second(s) until the next check 2025-09-27 04:14:21.302003 | orchestrator | 2025-09-27 04:14:21 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 04:14:21.303683 | orchestrator | 2025-09-27 04:14:21 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 04:14:21.303783 | orchestrator | 2025-09-27 04:14:21 | INFO  | Wait 1 second(s) until the next check 2025-09-27 04:14:24.349012 | orchestrator | 2025-09-27 04:14:24 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 04:14:24.350175 | orchestrator | 2025-09-27 04:14:24 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 04:14:24.350209 | orchestrator | 2025-09-27 04:14:24 | INFO  | Wait 1 second(s) until the next check 2025-09-27 04:14:27.393264 | orchestrator | 2025-09-27 04:14:27 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 04:14:27.395024 | orchestrator | 2025-09-27 04:14:27 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 04:14:27.395063 | orchestrator | 2025-09-27 04:14:27 | INFO  | Wait 1 second(s) until the next check 2025-09-27 04:14:30.438983 | orchestrator | 2025-09-27 04:14:30 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 04:14:30.440376 | orchestrator | 2025-09-27 04:14:30 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 04:14:30.440802 | orchestrator | 2025-09-27 04:14:30 | INFO  | Wait 1 second(s) 
until the next check 2025-09-27 04:14:33.482204 | orchestrator | 2025-09-27 04:14:33 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 04:14:33.484326 | orchestrator | 2025-09-27 04:14:33 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 04:14:33.484407 | orchestrator | 2025-09-27 04:14:33 | INFO  | Wait 1 second(s) until the next check 2025-09-27 04:14:36.537216 | orchestrator | 2025-09-27 04:14:36 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 04:14:36.538535 | orchestrator | 2025-09-27 04:14:36 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 04:14:36.538615 | orchestrator | 2025-09-27 04:14:36 | INFO  | Wait 1 second(s) until the next check 2025-09-27 04:14:39.590456 | orchestrator | 2025-09-27 04:14:39 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 04:14:39.592866 | orchestrator | 2025-09-27 04:14:39 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 04:14:39.592897 | orchestrator | 2025-09-27 04:14:39 | INFO  | Wait 1 second(s) until the next check 2025-09-27 04:14:42.636267 | orchestrator | 2025-09-27 04:14:42 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 04:14:42.639903 | orchestrator | 2025-09-27 04:14:42 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 04:14:42.639953 | orchestrator | 2025-09-27 04:14:42 | INFO  | Wait 1 second(s) until the next check 2025-09-27 04:14:45.685694 | orchestrator | 2025-09-27 04:14:45 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 04:14:45.687388 | orchestrator | 2025-09-27 04:14:45 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 04:14:45.687421 | orchestrator | 2025-09-27 04:14:45 | INFO  | Wait 1 second(s) until the next check 2025-09-27 04:14:48.735720 | orchestrator | 2025-09-27 04:14:48 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 04:14:48.736781 | orchestrator | 2025-09-27 04:14:48 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 04:14:48.737082 | orchestrator | 2025-09-27 04:14:48 | INFO  | Wait 1 second(s) until the next check 2025-09-27 04:14:51.781484 | orchestrator | 2025-09-27 04:14:51 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 04:14:51.783592 | orchestrator | 2025-09-27 04:14:51 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 04:14:51.783675 | orchestrator | 2025-09-27 04:14:51 | INFO  | Wait 1 second(s) until the next check 2025-09-27 04:14:54.831758 | orchestrator | 2025-09-27 04:14:54 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 04:14:54.833298 | orchestrator | 2025-09-27 04:14:54 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 04:14:54.833334 | orchestrator | 2025-09-27 04:14:54 | INFO  | Wait 1 second(s) until the next check 2025-09-27 04:14:57.881329 | orchestrator | 2025-09-27 04:14:57 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 04:14:57.883385 | orchestrator | 2025-09-27 04:14:57 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 04:14:57.883424 | orchestrator | 2025-09-27 04:14:57 | INFO  | Wait 1 second(s) until the next check 2025-09-27 04:15:00.930172 | orchestrator | 2025-09-27 04:15:00 | INFO  | 
Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 04:15:00.931624 | orchestrator | 2025-09-27 04:15:00 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 04:15:00.931966 | orchestrator | 2025-09-27 04:15:00 | INFO  | Wait 1 second(s) until the next check 2025-09-27 04:15:03.976458 | orchestrator | 2025-09-27 04:15:03 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 04:15:03.978337 | orchestrator | 2025-09-27 04:15:03 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 04:15:03.978374 | orchestrator | 2025-09-27 04:15:03 | INFO  | Wait 1 second(s) until the next check 2025-09-27 04:15:07.027178 | orchestrator | 2025-09-27 04:15:07 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 04:15:07.031138 | orchestrator | 2025-09-27 04:15:07 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 04:15:07.031196 | orchestrator | 2025-09-27 04:15:07 | INFO  | Wait 1 second(s) until the next check 2025-09-27 04:15:10.078611 | orchestrator | 2025-09-27 04:15:10 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 04:15:10.081153 | orchestrator | 2025-09-27 04:15:10 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 04:15:10.081182 | orchestrator | 2025-09-27 04:15:10 | INFO  | Wait 1 second(s) until the next check 2025-09-27 04:15:13.128674 | orchestrator | 2025-09-27 04:15:13 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 04:15:13.129367 | orchestrator | 2025-09-27 04:15:13 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 04:15:13.129635 | orchestrator | 2025-09-27 04:15:13 | INFO  | Wait 1 second(s) until the next check 2025-09-27 04:15:16.174881 | orchestrator | 2025-09-27 04:15:16 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 04:15:16.176066 | orchestrator | 2025-09-27 04:15:16 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 04:15:16.176443 | orchestrator | 2025-09-27 04:15:16 | INFO  | Wait 1 second(s) until the next check 2025-09-27 04:15:19.222425 | orchestrator | 2025-09-27 04:15:19 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 04:15:19.225042 | orchestrator | 2025-09-27 04:15:19 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 04:15:19.225084 | orchestrator | 2025-09-27 04:15:19 | INFO  | Wait 1 second(s) until the next check 2025-09-27 04:15:22.275607 | orchestrator | 2025-09-27 04:15:22 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 04:15:22.276748 | orchestrator | 2025-09-27 04:15:22 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 04:15:22.276766 | orchestrator | 2025-09-27 04:15:22 | INFO  | Wait 1 second(s) until the next check 2025-09-27 04:15:25.322129 | orchestrator | 2025-09-27 04:15:25 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 04:15:25.324061 | orchestrator | 2025-09-27 04:15:25 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 04:15:25.324096 | orchestrator | 2025-09-27 04:15:25 | INFO  | Wait 1 second(s) until the next check 2025-09-27 04:15:28.369222 | orchestrator | 2025-09-27 04:15:28 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 04:15:28.370984 | 
orchestrator | 2025-09-27 04:15:28 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 04:15:28.371026 | orchestrator | 2025-09-27 04:15:28 | INFO  | Wait 1 second(s) until the next check 2025-09-27 04:15:31.413480 | orchestrator | 2025-09-27 04:15:31 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 04:15:31.415145 | orchestrator | 2025-09-27 04:15:31 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 04:15:31.415254 | orchestrator | 2025-09-27 04:15:31 | INFO  | Wait 1 second(s) until the next check 2025-09-27 04:15:34.464600 | orchestrator | 2025-09-27 04:15:34 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 04:15:34.466004 | orchestrator | 2025-09-27 04:15:34 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 04:15:34.466091 | orchestrator | 2025-09-27 04:15:34 | INFO  | Wait 1 second(s) until the next check 2025-09-27 04:15:37.519084 | orchestrator | 2025-09-27 04:15:37 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 04:15:37.520471 | orchestrator | 2025-09-27 04:15:37 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 04:15:37.520511 | orchestrator | 2025-09-27 04:15:37 | INFO  | Wait 1 second(s) until the next check 2025-09-27 04:15:40.570106 | orchestrator | 2025-09-27 04:15:40 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 04:15:40.573837 | orchestrator | 2025-09-27 04:15:40 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 04:15:40.573871 | orchestrator | 2025-09-27 04:15:40 | INFO  | Wait 1 second(s) until the next check 2025-09-27 04:15:43.621653 | orchestrator | 2025-09-27 04:15:43 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 04:15:43.623169 | orchestrator | 2025-09-27 04:15:43 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 04:15:43.623211 | orchestrator | 2025-09-27 04:15:43 | INFO  | Wait 1 second(s) until the next check 2025-09-27 04:15:46.664249 | orchestrator | 2025-09-27 04:15:46 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 04:15:46.666984 | orchestrator | 2025-09-27 04:15:46 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 04:15:46.667065 | orchestrator | 2025-09-27 04:15:46 | INFO  | Wait 1 second(s) until the next check 2025-09-27 04:15:49.716351 | orchestrator | 2025-09-27 04:15:49 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 04:15:49.717380 | orchestrator | 2025-09-27 04:15:49 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 04:15:49.717411 | orchestrator | 2025-09-27 04:15:49 | INFO  | Wait 1 second(s) until the next check 2025-09-27 04:15:52.760572 | orchestrator | 2025-09-27 04:15:52 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 04:15:52.762554 | orchestrator | 2025-09-27 04:15:52 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 04:15:52.762587 | orchestrator | 2025-09-27 04:15:52 | INFO  | Wait 1 second(s) until the next check 2025-09-27 04:15:55.810349 | orchestrator | 2025-09-27 04:15:55 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 04:15:55.811266 | orchestrator | 2025-09-27 04:15:55 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state 
STARTED 2025-09-27 04:15:55.811356 | orchestrator | 2025-09-27 04:15:55 | INFO  | Wait 1 second(s) until the next check 2025-09-27 04:15:58.859736 | orchestrator | 2025-09-27 04:15:58 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 04:15:58.861427 | orchestrator | 2025-09-27 04:15:58 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 04:15:58.861463 | orchestrator | 2025-09-27 04:15:58 | INFO  | Wait 1 second(s) until the next check 2025-09-27 04:16:01.906622 | orchestrator | 2025-09-27 04:16:01 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 04:16:01.908231 | orchestrator | 2025-09-27 04:16:01 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 04:16:01.908351 | orchestrator | 2025-09-27 04:16:01 | INFO  | Wait 1 second(s) until the next check 2025-09-27 04:16:04.954264 | orchestrator | 2025-09-27 04:16:04 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 04:16:04.956053 | orchestrator | 2025-09-27 04:16:04 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 04:16:04.956086 | orchestrator | 2025-09-27 04:16:04 | INFO  | Wait 1 second(s) until the next check 2025-09-27 04:16:08.006654 | orchestrator | 2025-09-27 04:16:08 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 04:16:08.008755 | orchestrator | 2025-09-27 04:16:08 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 04:16:08.008934 | orchestrator | 2025-09-27 04:16:08 | INFO  | Wait 1 second(s) until the next check 2025-09-27 04:16:11.058397 | orchestrator | 2025-09-27 04:16:11 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 04:16:11.059329 | orchestrator | 2025-09-27 04:16:11 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 04:16:11.059387 | orchestrator | 2025-09-27 04:16:11 | INFO  | Wait 1 second(s) until the next check 2025-09-27 04:16:14.104586 | orchestrator | 2025-09-27 04:16:14 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 04:16:14.105900 | orchestrator | 2025-09-27 04:16:14 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 04:16:14.105933 | orchestrator | 2025-09-27 04:16:14 | INFO  | Wait 1 second(s) until the next check 2025-09-27 04:16:17.147912 | orchestrator | 2025-09-27 04:16:17 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 04:16:17.149182 | orchestrator | 2025-09-27 04:16:17 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 04:16:17.149212 | orchestrator | 2025-09-27 04:16:17 | INFO  | Wait 1 second(s) until the next check 2025-09-27 04:16:20.196465 | orchestrator | 2025-09-27 04:16:20 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 04:16:20.198087 | orchestrator | 2025-09-27 04:16:20 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 04:16:20.198215 | orchestrator | 2025-09-27 04:16:20 | INFO  | Wait 1 second(s) until the next check 2025-09-27 04:16:23.249208 | orchestrator | 2025-09-27 04:16:23 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 04:16:23.250698 | orchestrator | 2025-09-27 04:16:23 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 04:16:23.250729 | orchestrator | 2025-09-27 04:16:23 | INFO  | Wait 1 second(s) 
until the next check 2025-09-27 04:16:26.301211 | orchestrator | 2025-09-27 04:16:26 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 04:16:26.301974 | orchestrator | 2025-09-27 04:16:26 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 04:16:26.302009 | orchestrator | 2025-09-27 04:16:26 | INFO  | Wait 1 second(s) until the next check 2025-09-27 04:16:29.357211 | orchestrator | 2025-09-27 04:16:29 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 04:16:29.358976 | orchestrator | 2025-09-27 04:16:29 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 04:16:29.359011 | orchestrator | 2025-09-27 04:16:29 | INFO  | Wait 1 second(s) until the next check 2025-09-27 04:16:32.403924 | orchestrator | 2025-09-27 04:16:32 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 04:16:32.405354 | orchestrator | 2025-09-27 04:16:32 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 04:16:32.405563 | orchestrator | 2025-09-27 04:16:32 | INFO  | Wait 1 second(s) until the next check 2025-09-27 04:16:35.449376 | orchestrator | 2025-09-27 04:16:35 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 04:16:35.449877 | orchestrator | 2025-09-27 04:16:35 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 04:16:35.449939 | orchestrator | 2025-09-27 04:16:35 | INFO  | Wait 1 second(s) until the next check 2025-09-27 04:16:38.493756 | orchestrator | 2025-09-27 04:16:38 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 04:16:38.496486 | orchestrator | 2025-09-27 04:16:38 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 04:16:38.496517 | orchestrator | 2025-09-27 04:16:38 | INFO  | Wait 1 second(s) until the next check 2025-09-27 04:16:41.546238 | orchestrator | 2025-09-27 04:16:41 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 04:16:41.548626 | orchestrator | 2025-09-27 04:16:41 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 04:16:41.548657 | orchestrator | 2025-09-27 04:16:41 | INFO  | Wait 1 second(s) until the next check 2025-09-27 04:16:44.594323 | orchestrator | 2025-09-27 04:16:44 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 04:16:44.595730 | orchestrator | 2025-09-27 04:16:44 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 04:16:44.595898 | orchestrator | 2025-09-27 04:16:44 | INFO  | Wait 1 second(s) until the next check 2025-09-27 04:16:47.643053 | orchestrator | 2025-09-27 04:16:47 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 04:16:47.644654 | orchestrator | 2025-09-27 04:16:47 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 04:16:47.644775 | orchestrator | 2025-09-27 04:16:47 | INFO  | Wait 1 second(s) until the next check 2025-09-27 04:16:50.692549 | orchestrator | 2025-09-27 04:16:50 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 04:16:50.694580 | orchestrator | 2025-09-27 04:16:50 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 04:16:50.694854 | orchestrator | 2025-09-27 04:16:50 | INFO  | Wait 1 second(s) until the next check 2025-09-27 04:16:53.738351 | orchestrator | 2025-09-27 04:16:53 | INFO  | 
Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 04:16:53.739825 | orchestrator | 2025-09-27 04:16:53 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 04:16:53.740045 | orchestrator | 2025-09-27 04:16:53 | INFO  | Wait 1 second(s) until the next check 2025-09-27 04:16:56.780875 | orchestrator | 2025-09-27 04:16:56 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 04:16:56.782923 | orchestrator | 2025-09-27 04:16:56 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 04:16:56.782956 | orchestrator | 2025-09-27 04:16:56 | INFO  | Wait 1 second(s) until the next check 2025-09-27 04:16:59.825647 | orchestrator | 2025-09-27 04:16:59 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 04:16:59.826681 | orchestrator | 2025-09-27 04:16:59 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 04:16:59.826717 | orchestrator | 2025-09-27 04:16:59 | INFO  | Wait 1 second(s) until the next check 2025-09-27 04:17:02.867745 | orchestrator | 2025-09-27 04:17:02 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 04:17:02.869050 | orchestrator | 2025-09-27 04:17:02 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 04:17:02.869081 | orchestrator | 2025-09-27 04:17:02 | INFO  | Wait 1 second(s) until the next check 2025-09-27 04:17:05.916528 | orchestrator | 2025-09-27 04:17:05 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 04:17:05.918323 | orchestrator | 2025-09-27 04:17:05 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 04:17:05.918515 | orchestrator | 2025-09-27 04:17:05 | INFO  | Wait 1 second(s) until the next check 2025-09-27 04:17:08.963916 | orchestrator | 2025-09-27 04:17:08 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 04:17:08.965153 | orchestrator | 2025-09-27 04:17:08 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 04:17:08.965269 | orchestrator | 2025-09-27 04:17:08 | INFO  | Wait 1 second(s) until the next check 2025-09-27 04:17:12.020198 | orchestrator | 2025-09-27 04:17:12 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 04:17:12.024120 | orchestrator | 2025-09-27 04:17:12 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 04:17:12.024505 | orchestrator | 2025-09-27 04:17:12 | INFO  | Wait 1 second(s) until the next check 2025-09-27 04:17:15.066530 | orchestrator | 2025-09-27 04:17:15 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 04:17:15.067403 | orchestrator | 2025-09-27 04:17:15 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 04:17:15.067529 | orchestrator | 2025-09-27 04:17:15 | INFO  | Wait 1 second(s) until the next check 2025-09-27 04:17:18.116970 | orchestrator | 2025-09-27 04:17:18 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 04:17:18.118428 | orchestrator | 2025-09-27 04:17:18 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 04:17:18.118524 | orchestrator | 2025-09-27 04:17:18 | INFO  | Wait 1 second(s) until the next check 2025-09-27 04:17:21.161562 | orchestrator | 2025-09-27 04:17:21 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 04:17:21.163830 | 
orchestrator | 2025-09-27 04:17:21 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 04:17:21.163862 | orchestrator | 2025-09-27 04:17:21 | INFO  | Wait 1 second(s) until the next check 2025-09-27 04:17:24.204551 | orchestrator | 2025-09-27 04:17:24 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 04:17:24.205914 | orchestrator | 2025-09-27 04:17:24 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 04:17:24.206082 | orchestrator | 2025-09-27 04:17:24 | INFO  | Wait 1 second(s) until the next check 2025-09-27 04:17:27.252266 | orchestrator | 2025-09-27 04:17:27 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 04:17:27.253150 | orchestrator | 2025-09-27 04:17:27 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 04:17:27.253183 | orchestrator | 2025-09-27 04:17:27 | INFO  | Wait 1 second(s) until the next check 2025-09-27 04:17:30.299585 | orchestrator | 2025-09-27 04:17:30 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 04:17:30.301210 | orchestrator | 2025-09-27 04:17:30 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 04:17:30.301245 | orchestrator | 2025-09-27 04:17:30 | INFO  | Wait 1 second(s) until the next check 2025-09-27 04:17:33.347485 | orchestrator | 2025-09-27 04:17:33 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 04:17:33.349548 | orchestrator | 2025-09-27 04:17:33 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 04:17:33.349582 | orchestrator | 2025-09-27 04:17:33 | INFO  | Wait 1 second(s) until the next check 2025-09-27 04:17:36.398371 | orchestrator | 2025-09-27 04:17:36 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 04:17:36.399686 | orchestrator | 2025-09-27 04:17:36 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 04:17:36.399806 | orchestrator | 2025-09-27 04:17:36 | INFO  | Wait 1 second(s) until the next check 2025-09-27 04:17:39.447652 | orchestrator | 2025-09-27 04:17:39 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 04:17:39.450123 | orchestrator | 2025-09-27 04:17:39 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 04:17:39.450173 | orchestrator | 2025-09-27 04:17:39 | INFO  | Wait 1 second(s) until the next check 2025-09-27 04:17:42.490634 | orchestrator | 2025-09-27 04:17:42 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 04:17:42.492188 | orchestrator | 2025-09-27 04:17:42 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 04:17:42.492301 | orchestrator | 2025-09-27 04:17:42 | INFO  | Wait 1 second(s) until the next check 2025-09-27 04:17:45.540603 | orchestrator | 2025-09-27 04:17:45 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 04:17:45.541684 | orchestrator | 2025-09-27 04:17:45 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 04:17:45.541716 | orchestrator | 2025-09-27 04:17:45 | INFO  | Wait 1 second(s) until the next check 2025-09-27 04:17:48.592470 | orchestrator | 2025-09-27 04:17:48 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 04:17:48.593626 | orchestrator | 2025-09-27 04:17:48 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state 
STARTED 2025-09-27 04:17:48.593658 | orchestrator | 2025-09-27 04:17:48 | INFO  | Wait 1 second(s) until the next check 2025-09-27 04:17:51.637862 | orchestrator | 2025-09-27 04:17:51 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 04:17:51.640007 | orchestrator | 2025-09-27 04:17:51 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 04:17:51.640038 | orchestrator | 2025-09-27 04:17:51 | INFO  | Wait 1 second(s) until the next check 2025-09-27 04:17:54.687895 | orchestrator | 2025-09-27 04:17:54 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 04:17:54.689376 | orchestrator | 2025-09-27 04:17:54 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 04:17:54.689406 | orchestrator | 2025-09-27 04:17:54 | INFO  | Wait 1 second(s) until the next check 2025-09-27 04:17:57.735461 | orchestrator | 2025-09-27 04:17:57 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 04:17:57.736864 | orchestrator | 2025-09-27 04:17:57 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 04:17:57.736897 | orchestrator | 2025-09-27 04:17:57 | INFO  | Wait 1 second(s) until the next check 2025-09-27 04:18:00.783184 | orchestrator | 2025-09-27 04:18:00 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 04:18:00.784917 | orchestrator | 2025-09-27 04:18:00 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 04:18:00.784949 | orchestrator | 2025-09-27 04:18:00 | INFO  | Wait 1 second(s) until the next check 2025-09-27 04:18:03.829349 | orchestrator | 2025-09-27 04:18:03 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 04:18:03.830978 | orchestrator | 2025-09-27 04:18:03 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 04:18:03.831045 | orchestrator | 2025-09-27 04:18:03 | INFO  | Wait 1 second(s) until the next check 2025-09-27 04:18:06.878406 | orchestrator | 2025-09-27 04:18:06 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 04:18:06.879989 | orchestrator | 2025-09-27 04:18:06 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 04:18:06.880019 | orchestrator | 2025-09-27 04:18:06 | INFO  | Wait 1 second(s) until the next check 2025-09-27 04:18:09.929883 | orchestrator | 2025-09-27 04:18:09 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 04:18:09.930959 | orchestrator | 2025-09-27 04:18:09 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 04:18:09.930990 | orchestrator | 2025-09-27 04:18:09 | INFO  | Wait 1 second(s) until the next check 2025-09-27 04:18:12.979091 | orchestrator | 2025-09-27 04:18:12 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 04:18:12.980285 | orchestrator | 2025-09-27 04:18:12 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 04:18:12.980612 | orchestrator | 2025-09-27 04:18:12 | INFO  | Wait 1 second(s) until the next check 2025-09-27 04:18:16.027883 | orchestrator | 2025-09-27 04:18:16 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 04:18:16.029684 | orchestrator | 2025-09-27 04:18:16 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 04:18:16.029714 | orchestrator | 2025-09-27 04:18:16 | INFO  | Wait 1 second(s) 
until the next check 2025-09-27 04:18:19.077544 | orchestrator | 2025-09-27 04:18:19 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 04:18:19.079378 | orchestrator | 2025-09-27 04:18:19 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 04:18:19.079409 | orchestrator | 2025-09-27 04:18:19 | INFO  | Wait 1 second(s) until the next check 2025-09-27 04:18:22.130406 | orchestrator | 2025-09-27 04:18:22 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 04:18:22.132204 | orchestrator | 2025-09-27 04:18:22 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 04:18:22.132235 | orchestrator | 2025-09-27 04:18:22 | INFO  | Wait 1 second(s) until the next check 2025-09-27 04:18:25.180264 | orchestrator | 2025-09-27 04:18:25 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 04:18:25.182257 | orchestrator | 2025-09-27 04:18:25 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 04:18:25.182474 | orchestrator | 2025-09-27 04:18:25 | INFO  | Wait 1 second(s) until the next check 2025-09-27 04:18:28.230585 | orchestrator | 2025-09-27 04:18:28 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 04:18:28.232680 | orchestrator | 2025-09-27 04:18:28 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 04:18:28.232714 | orchestrator | 2025-09-27 04:18:28 | INFO  | Wait 1 second(s) until the next check 2025-09-27 04:18:31.286304 | orchestrator | 2025-09-27 04:18:31 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 04:18:31.287575 | orchestrator | 2025-09-27 04:18:31 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 04:18:31.287904 | orchestrator | 2025-09-27 04:18:31 | INFO  | Wait 1 second(s) until the next check 2025-09-27 04:18:34.330467 | orchestrator | 2025-09-27 04:18:34 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 04:18:34.331889 | orchestrator | 2025-09-27 04:18:34 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 04:18:34.331959 | orchestrator | 2025-09-27 04:18:34 | INFO  | Wait 1 second(s) until the next check 2025-09-27 04:18:37.378285 | orchestrator | 2025-09-27 04:18:37 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 04:18:37.380009 | orchestrator | 2025-09-27 04:18:37 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 04:18:37.380086 | orchestrator | 2025-09-27 04:18:37 | INFO  | Wait 1 second(s) until the next check 2025-09-27 04:18:40.419331 | orchestrator | 2025-09-27 04:18:40 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 04:18:40.421076 | orchestrator | 2025-09-27 04:18:40 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 04:18:40.421107 | orchestrator | 2025-09-27 04:18:40 | INFO  | Wait 1 second(s) until the next check 2025-09-27 04:18:43.469674 | orchestrator | 2025-09-27 04:18:43 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 04:18:43.470822 | orchestrator | 2025-09-27 04:18:43 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 04:18:43.470918 | orchestrator | 2025-09-27 04:18:43 | INFO  | Wait 1 second(s) until the next check 2025-09-27 04:18:46.525647 | orchestrator | 2025-09-27 04:18:46 | INFO  | 
Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 04:18:46.528164 | orchestrator | 2025-09-27 04:18:46 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 04:18:46.528466 | orchestrator | 2025-09-27 04:18:46 | INFO  | Wait 1 second(s) until the next check 2025-09-27 04:18:49.579308 | orchestrator | 2025-09-27 04:18:49 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 04:18:49.583312 | orchestrator | 2025-09-27 04:18:49 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 04:18:49.583459 | orchestrator | 2025-09-27 04:18:49 | INFO  | Wait 1 second(s) until the next check 2025-09-27 04:18:52.630373 | orchestrator | 2025-09-27 04:18:52 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 04:18:52.631607 | orchestrator | 2025-09-27 04:18:52 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 04:18:52.632001 | orchestrator | 2025-09-27 04:18:52 | INFO  | Wait 1 second(s) until the next check 2025-09-27 04:18:55.678000 | orchestrator | 2025-09-27 04:18:55 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 04:18:55.678814 | orchestrator | 2025-09-27 04:18:55 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 04:18:55.678843 | orchestrator | 2025-09-27 04:18:55 | INFO  | Wait 1 second(s) until the next check 2025-09-27 04:18:58.716448 | orchestrator | 2025-09-27 04:18:58 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 04:18:58.718238 | orchestrator | 2025-09-27 04:18:58 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 04:18:58.718271 | orchestrator | 2025-09-27 04:18:58 | INFO  | Wait 1 second(s) until the next check 2025-09-27 04:19:01.774133 | orchestrator | 2025-09-27 04:19:01 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 04:19:01.775693 | orchestrator | 2025-09-27 04:19:01 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 04:19:01.775934 | orchestrator | 2025-09-27 04:19:01 | INFO  | Wait 1 second(s) until the next check 2025-09-27 04:19:04.823312 | orchestrator | 2025-09-27 04:19:04 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 04:19:04.824635 | orchestrator | 2025-09-27 04:19:04 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 04:19:04.824838 | orchestrator | 2025-09-27 04:19:04 | INFO  | Wait 1 second(s) until the next check 2025-09-27 04:19:07.875945 | orchestrator | 2025-09-27 04:19:07 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 04:19:07.877698 | orchestrator | 2025-09-27 04:19:07 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 04:19:07.877807 | orchestrator | 2025-09-27 04:19:07 | INFO  | Wait 1 second(s) until the next check 2025-09-27 04:19:10.922423 | orchestrator | 2025-09-27 04:19:10 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 04:19:10.923932 | orchestrator | 2025-09-27 04:19:10 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 04:19:10.923969 | orchestrator | 2025-09-27 04:19:10 | INFO  | Wait 1 second(s) until the next check 2025-09-27 04:19:13.972491 | orchestrator | 2025-09-27 04:19:13 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 04:19:13.975051 | 
orchestrator | 2025-09-27 04:19:13 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 04:19:13.975386 | orchestrator | 2025-09-27 04:19:13 | INFO  | Wait 1 second(s) until the next check 2025-09-27 04:19:17.020515 | orchestrator | 2025-09-27 04:19:17 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 04:19:17.021992 | orchestrator | 2025-09-27 04:19:17 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 04:19:17.022081 | orchestrator | 2025-09-27 04:19:17 | INFO  | Wait 1 second(s) until the next check 2025-09-27 04:19:20.069186 | orchestrator | 2025-09-27 04:19:20 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 04:19:20.070436 | orchestrator | 2025-09-27 04:19:20 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 04:19:20.070519 | orchestrator | 2025-09-27 04:19:20 | INFO  | Wait 1 second(s) until the next check 2025-09-27 04:19:23.114882 | orchestrator | 2025-09-27 04:19:23 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 04:19:23.117527 | orchestrator | 2025-09-27 04:19:23 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 04:19:23.117559 | orchestrator | 2025-09-27 04:19:23 | INFO  | Wait 1 second(s) until the next check 2025-09-27 04:19:26.158650 | orchestrator | 2025-09-27 04:19:26 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 04:19:26.160533 | orchestrator | 2025-09-27 04:19:26 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 04:19:26.160599 | orchestrator | 2025-09-27 04:19:26 | INFO  | Wait 1 second(s) until the next check 2025-09-27 04:19:29.207993 | orchestrator | 2025-09-27 04:19:29 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 04:19:29.209967 | orchestrator | 2025-09-27 04:19:29 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 04:19:29.210101 | orchestrator | 2025-09-27 04:19:29 | INFO  | Wait 1 second(s) until the next check 2025-09-27 04:19:32.262318 | orchestrator | 2025-09-27 04:19:32 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 04:19:32.263334 | orchestrator | 2025-09-27 04:19:32 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 04:19:32.263363 | orchestrator | 2025-09-27 04:19:32 | INFO  | Wait 1 second(s) until the next check 2025-09-27 04:19:35.307259 | orchestrator | 2025-09-27 04:19:35 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 04:19:35.308849 | orchestrator | 2025-09-27 04:19:35 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 04:19:35.308878 | orchestrator | 2025-09-27 04:19:35 | INFO  | Wait 1 second(s) until the next check 2025-09-27 04:19:38.348687 | orchestrator | 2025-09-27 04:19:38 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 04:19:38.349485 | orchestrator | 2025-09-27 04:19:38 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 04:19:38.349515 | orchestrator | 2025-09-27 04:19:38 | INFO  | Wait 1 second(s) until the next check 2025-09-27 04:19:41.394295 | orchestrator | 2025-09-27 04:19:41 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 04:19:41.396478 | orchestrator | 2025-09-27 04:19:41 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state 
STARTED 2025-09-27 04:19:41.396532 | orchestrator | 2025-09-27 04:19:41 | INFO  | Wait 1 second(s) until the next check 2025-09-27 04:19:44.451060 | orchestrator | 2025-09-27 04:19:44 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 04:19:44.451798 | orchestrator | 2025-09-27 04:19:44 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 04:19:44.452204 | orchestrator | 2025-09-27 04:19:44 | INFO  | Wait 1 second(s) until the next check 2025-09-27 04:19:47.498850 | orchestrator | 2025-09-27 04:19:47 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 04:19:47.499874 | orchestrator | 2025-09-27 04:19:47 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 04:19:47.499957 | orchestrator | 2025-09-27 04:19:47 | INFO  | Wait 1 second(s) until the next check 2025-09-27 04:19:50.546246 | orchestrator | 2025-09-27 04:19:50 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 04:19:50.547209 | orchestrator | 2025-09-27 04:19:50 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 04:19:50.547238 | orchestrator | 2025-09-27 04:19:50 | INFO  | Wait 1 second(s) until the next check 2025-09-27 04:19:53.591366 | orchestrator | 2025-09-27 04:19:53 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 04:19:53.593467 | orchestrator | 2025-09-27 04:19:53 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 04:19:53.593727 | orchestrator | 2025-09-27 04:19:53 | INFO  | Wait 1 second(s) until the next check 2025-09-27 04:19:56.640366 | orchestrator | 2025-09-27 04:19:56 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 04:19:56.642196 | orchestrator | 2025-09-27 04:19:56 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 04:19:56.642228 | orchestrator | 2025-09-27 04:19:56 | INFO  | Wait 1 second(s) until the next check 2025-09-27 04:19:59.690608 | orchestrator | 2025-09-27 04:19:59 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 04:19:59.692589 | orchestrator | 2025-09-27 04:19:59 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 04:19:59.692821 | orchestrator | 2025-09-27 04:19:59 | INFO  | Wait 1 second(s) until the next check 2025-09-27 04:20:02.738511 | orchestrator | 2025-09-27 04:20:02 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 04:20:02.739341 | orchestrator | 2025-09-27 04:20:02 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 04:20:02.739402 | orchestrator | 2025-09-27 04:20:02 | INFO  | Wait 1 second(s) until the next check 2025-09-27 04:20:05.787367 | orchestrator | 2025-09-27 04:20:05 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 04:20:05.788920 | orchestrator | 2025-09-27 04:20:05 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 04:20:05.788948 | orchestrator | 2025-09-27 04:20:05 | INFO  | Wait 1 second(s) until the next check 2025-09-27 04:20:08.833483 | orchestrator | 2025-09-27 04:20:08 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 04:20:08.834400 | orchestrator | 2025-09-27 04:20:08 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 04:20:08.834430 | orchestrator | 2025-09-27 04:20:08 | INFO  | Wait 1 second(s) 
until the next check
2025-09-27 04:20:11.882291 | orchestrator | 2025-09-27 04:20:11 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED
2025-09-27 04:20:11.884545 | orchestrator | 2025-09-27 04:20:11 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED
[... repeated polling output condensed: both tasks remained in state STARTED and were re-checked roughly every 3 seconds ("Wait 1 second(s) until the next check") from 2025-09-27 04:20:11 until 2025-09-27 04:29:32 ...]
2025-09-27 04:29:32.660722 | orchestrator | 2025-09-27 04:29:32 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED
2025-09-27 04:29:32.662104 | orchestrator | 2025-09-27 04:29:32 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED
2025-09-27 04:29:32.662158 | orchestrator | 2025-09-27 04:29:32 | INFO  | Wait 1 second(s)
until the next check 2025-09-27 04:29:35.711619 | orchestrator | 2025-09-27 04:29:35 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 04:29:35.712520 | orchestrator | 2025-09-27 04:29:35 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 04:29:35.712603 | orchestrator | 2025-09-27 04:29:35 | INFO  | Wait 1 second(s) until the next check 2025-09-27 04:29:38.758891 | orchestrator | 2025-09-27 04:29:38 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 04:29:38.760595 | orchestrator | 2025-09-27 04:29:38 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 04:29:38.760631 | orchestrator | 2025-09-27 04:29:38 | INFO  | Wait 1 second(s) until the next check 2025-09-27 04:29:41.807736 | orchestrator | 2025-09-27 04:29:41 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 04:29:41.809334 | orchestrator | 2025-09-27 04:29:41 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 04:29:41.809369 | orchestrator | 2025-09-27 04:29:41 | INFO  | Wait 1 second(s) until the next check 2025-09-27 04:29:44.850782 | orchestrator | 2025-09-27 04:29:44 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 04:29:44.851775 | orchestrator | 2025-09-27 04:29:44 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 04:29:44.851805 | orchestrator | 2025-09-27 04:29:44 | INFO  | Wait 1 second(s) until the next check 2025-09-27 04:29:47.901384 | orchestrator | 2025-09-27 04:29:47 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 04:29:47.902718 | orchestrator | 2025-09-27 04:29:47 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 04:29:47.902752 | orchestrator | 2025-09-27 04:29:47 | INFO  | Wait 1 second(s) until the next check 2025-09-27 04:29:50.949410 | orchestrator | 2025-09-27 04:29:50 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 04:29:50.951276 | orchestrator | 2025-09-27 04:29:50 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 04:29:50.951315 | orchestrator | 2025-09-27 04:29:50 | INFO  | Wait 1 second(s) until the next check 2025-09-27 04:29:53.996821 | orchestrator | 2025-09-27 04:29:53 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 04:29:53.998234 | orchestrator | 2025-09-27 04:29:53 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 04:29:53.998263 | orchestrator | 2025-09-27 04:29:53 | INFO  | Wait 1 second(s) until the next check 2025-09-27 04:29:57.039178 | orchestrator | 2025-09-27 04:29:57 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 04:29:57.040420 | orchestrator | 2025-09-27 04:29:57 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 04:29:57.040448 | orchestrator | 2025-09-27 04:29:57 | INFO  | Wait 1 second(s) until the next check 2025-09-27 04:30:00.089660 | orchestrator | 2025-09-27 04:30:00 | INFO  | Task c8c195a8-0572-4728-82e9-0d11795e0ba9 is in state STARTED 2025-09-27 04:30:00.091143 | orchestrator | 2025-09-27 04:30:00 | INFO  | Task 6080a85d-265e-44df-8fd4-200b92feb3b5 is in state STARTED 2025-09-27 04:30:00.091711 | orchestrator | 2025-09-27 04:30:00 | INFO  | Wait 1 second(s) until the next check 2025-09-27 04:30:03.140870 | orchestrator | 2025-09-27 04:30:03 | INFO  | 
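The deploy playbook was still polling the two task IDs when the job hit its time limit, so Zuul aborts the run with RESULT_TIMED_OUT. Below is a minimal sketch of such a polling loop with its own deadline, so a caller can fail fast instead of relying on the job-level timeout; the `get_task_state` callable and the state names are assumptions for illustration, not the actual osism API.

```python
import time

def wait_for_tasks(task_ids, get_task_state, interval=3.0, deadline=3600.0):
    """Poll task states until every task finishes or the deadline expires.

    ``get_task_state`` is a caller-supplied callable (hypothetical here)
    that maps a task ID to a state string such as "STARTED" or "SUCCESS".
    """
    start = time.monotonic()
    pending = set(task_ids)
    while pending:
        for task_id in sorted(pending):
            state = get_task_state(task_id)
            print(f"Task {task_id} is in state {state}")
            if state not in ("PENDING", "STARTED"):
                pending.discard(task_id)  # treat anything else as terminal
        if not pending:
            break
        if time.monotonic() - start > deadline:
            raise TimeoutError(f"tasks still not finished: {sorted(pending)}")
        print(f"Wait {interval} second(s) until the next check")
        time.sleep(interval)
```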
2025-09-27 04:30:23.970428 | POST-RUN START: [untrusted : github.com/osism/testbed/playbooks/post.yml@main]
2025-09-27 04:30:24.723435 |
2025-09-27 04:30:24.723608 | PLAY [Post output play]
2025-09-27 04:30:24.740329 |
2025-09-27 04:30:24.740488 | LOOP [stage-output : Register sources]
2025-09-27 04:30:24.811937 |
2025-09-27 04:30:24.812276 | TASK [stage-output : Check sudo]
2025-09-27 04:30:25.684043 | orchestrator | sudo: a password is required
2025-09-27 04:30:25.859425 | orchestrator | ok: Runtime: 0:00:00.014430
2025-09-27 04:30:25.874459 |
2025-09-27 04:30:25.874630 | LOOP [stage-output : Set source and destination for files and folders]
2025-09-27 04:30:25.914671 |
2025-09-27 04:30:25.915025 | TASK [stage-output : Build a list of source, dest dictionaries]
2025-09-27 04:30:25.980521 | orchestrator | ok
2025-09-27 04:30:25.991102 |
2025-09-27 04:30:25.991254 | LOOP [stage-output : Ensure target folders exist]
2025-09-27 04:30:26.482252 | orchestrator | ok: "docs"
2025-09-27 04:30:26.482730 |
2025-09-27 04:30:26.725926 | orchestrator | ok: "artifacts"
2025-09-27 04:30:26.977957 | orchestrator | ok: "logs"
2025-09-27 04:30:26.996961 |
2025-09-27 04:30:26.997162 | LOOP [stage-output : Copy files and folders to staging folder]
2025-09-27 04:30:27.039619 |
2025-09-27 04:30:27.039962 | TASK [stage-output : Make all log files readable]
2025-09-27 04:30:27.310224 | orchestrator | ok
2025-09-27 04:30:27.320282 |
2025-09-27 04:30:27.320451 | TASK [stage-output : Rename log files that match extensions_to_txt]
2025-09-27 04:30:27.356064 | orchestrator | skipping: Conditional result was False
2025-09-27 04:30:27.368464 |
2025-09-27 04:30:27.368696 | TASK [stage-output : Discover log files for compression]
2025-09-27 04:30:27.393911 | orchestrator | skipping: Conditional result was False
2025-09-27 04:30:27.406555 |
2025-09-27 04:30:27.406741 | LOOP [stage-output : Archive everything from logs]
2025-09-27 04:30:27.450741 |
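The stage-output tasks above boil down to: build a list of declared outputs, make sure the docs/artifacts/logs target folders exist, and copy everything into the staging area. A rough Python equivalent is sketched below; the source paths are invented placeholders, and the real role is driven by job variables (along the lines of zuul_copy_output) rather than a hard-coded dict.

```python
import shutil
from pathlib import Path

# Invented example mapping of output sources to staging subfolders; the real
# stage-output role builds this list from job variables, not a literal dict.
SOURCES = {
    "docs": ["~/output/docs"],
    "artifacts": ["~/output/artifacts"],
    "logs": ["~/output/logs"],
}

def stage_outputs(staging_root: Path) -> None:
    """Copy declared sources into <staging_root>/{docs,artifacts,logs}."""
    for target, paths in SOURCES.items():
        dest_dir = staging_root / target
        dest_dir.mkdir(parents=True, exist_ok=True)  # "Ensure target folders exist"
        for raw in paths:
            src = Path(raw).expanduser()
            if not src.exists():          # missing sources are simply skipped
                continue
            if src.is_dir():
                shutil.copytree(src, dest_dir / src.name, dirs_exist_ok=True)
            else:
                shutil.copy2(src, dest_dir / src.name)
```

The "Make all log files readable" task that follows is essentially a recursive permission fix over the staged logs so the collection and upload steps later in the job can read them.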
2025-09-27 04:30:27.450962 | PLAY [Post cleanup play]
2025-09-27 04:30:27.458945 |
2025-09-27 04:30:27.459068 | TASK [Set cloud fact (Zuul deployment)]
2025-09-27 04:30:27.510130 | orchestrator | ok
2025-09-27 04:30:27.518948 |
2025-09-27 04:30:27.519068 | TASK [Set cloud fact (local deployment)]
2025-09-27 04:30:27.547677 | orchestrator | skipping: Conditional result was False
2025-09-27 04:30:27.555389 |
2025-09-27 04:30:27.555508 | TASK [Clean the cloud environment]
2025-09-27 04:30:28.177633 | orchestrator | 2025-09-27 04:30:28 - clean up servers
2025-09-27 04:30:29.042646 | orchestrator | 2025-09-27 04:30:29 - testbed-manager
2025-09-27 04:30:29.142943 | orchestrator | 2025-09-27 04:30:29 - testbed-node-3
2025-09-27 04:30:29.240013 | orchestrator | 2025-09-27 04:30:29 - testbed-node-5
2025-09-27 04:30:29.329636 | orchestrator | 2025-09-27 04:30:29 - testbed-node-4
2025-09-27 04:30:29.434438 | orchestrator | 2025-09-27 04:30:29 - testbed-node-2
2025-09-27 04:30:29.536308 | orchestrator | 2025-09-27 04:30:29 - testbed-node-1
2025-09-27 04:30:29.631505 | orchestrator | 2025-09-27 04:30:29 - testbed-node-0
2025-09-27 04:30:29.730972 | orchestrator | 2025-09-27 04:30:29 - clean up keypairs
2025-09-27 04:30:29.751850 | orchestrator | 2025-09-27 04:30:29 - testbed
2025-09-27 04:30:29.780655 | orchestrator | 2025-09-27 04:30:29 - wait for servers to be gone
2025-09-27 04:30:38.472304 | orchestrator | 2025-09-27 04:30:38 - clean up ports
2025-09-27 04:30:38.650570 | orchestrator | 2025-09-27 04:30:38 - 47873cf6-2a68-4d7f-a86d-564687dd5aa4
2025-09-27 04:30:38.900168 | orchestrator | 2025-09-27 04:30:38 - 75825c42-c538-476e-933f-51f6cabe9e10
2025-09-27 04:30:39.374405 | orchestrator | 2025-09-27 04:30:39 - 7ceef1a8-0eff-45fe-a6a0-65d60cf74770
2025-09-27 04:30:39.644949 | orchestrator | 2025-09-27 04:30:39 - 7fff1363-c11b-49bf-917d-b5d5d67aa8d4
2025-09-27 04:30:39.862414 | orchestrator | 2025-09-27 04:30:39 - 91107a2d-d933-4c00-8bef-7f70db12714a
2025-09-27 04:30:40.142982 | orchestrator | 2025-09-27 04:30:40 - e91cb4df-06df-4852-805a-cdc5aac64a4a
2025-09-27 04:30:40.367094 | orchestrator | 2025-09-27 04:30:40 - fdee2298-3643-4c99-b472-dc41777fdaef
2025-09-27 04:30:40.633371 | orchestrator | 2025-09-27 04:30:40 - clean up volumes
2025-09-27 04:30:40.751423 | orchestrator | 2025-09-27 04:30:40 - testbed-volume-4-node-base
2025-09-27 04:30:40.791267 | orchestrator | 2025-09-27 04:30:40 - testbed-volume-1-node-base
2025-09-27 04:30:40.830666 | orchestrator | 2025-09-27 04:30:40 - testbed-volume-3-node-base
2025-09-27 04:30:40.872167 | orchestrator | 2025-09-27 04:30:40 - testbed-volume-2-node-base
2025-09-27 04:30:40.913492 | orchestrator | 2025-09-27 04:30:40 - testbed-volume-5-node-base
2025-09-27 04:30:40.953731 | orchestrator | 2025-09-27 04:30:40 - testbed-volume-0-node-base
2025-09-27 04:30:40.999617 | orchestrator | 2025-09-27 04:30:40 - testbed-volume-manager-base
2025-09-27 04:30:41.041936 | orchestrator | 2025-09-27 04:30:41 - testbed-volume-7-node-4
2025-09-27 04:30:41.081419 | orchestrator | 2025-09-27 04:30:41 - testbed-volume-1-node-4
2025-09-27 04:30:41.125504 | orchestrator | 2025-09-27 04:30:41 - testbed-volume-0-node-3
2025-09-27 04:30:41.166071 | orchestrator | 2025-09-27 04:30:41 - testbed-volume-3-node-3
2025-09-27 04:30:41.203984 | orchestrator | 2025-09-27 04:30:41 - testbed-volume-4-node-4
2025-09-27 04:30:41.246134 | orchestrator | 2025-09-27 04:30:41 - testbed-volume-8-node-5
2025-09-27 04:30:41.285136 | orchestrator | 2025-09-27 04:30:41 - testbed-volume-2-node-5
2025-09-27 04:30:41.332867 | orchestrator | 2025-09-27 04:30:41 - testbed-volume-6-node-3
2025-09-27 04:30:41.379862 | orchestrator | 2025-09-27 04:30:41 - testbed-volume-5-node-5
2025-09-27 04:30:41.422671 | orchestrator | 2025-09-27 04:30:41 - disconnect routers
2025-09-27 04:30:41.556702 | orchestrator | 2025-09-27 04:30:41 - testbed
2025-09-27 04:30:42.988127 | orchestrator | 2025-09-27 04:30:42 - clean up subnets
2025-09-27 04:30:43.030572 | orchestrator | 2025-09-27 04:30:43 - subnet-testbed-management
2025-09-27 04:30:43.184825 | orchestrator | 2025-09-27 04:30:43 - clean up networks
2025-09-27 04:30:43.372387 | orchestrator | 2025-09-27 04:30:43 - net-testbed-management
2025-09-27 04:30:43.709266 | orchestrator | 2025-09-27 04:30:43 - clean up security groups
2025-09-27 04:30:43.744089 | orchestrator | 2025-09-27 04:30:43 - testbed-node
2025-09-27 04:30:43.882211 | orchestrator | 2025-09-27 04:30:43 - testbed-management
2025-09-27 04:30:44.006779 | orchestrator | 2025-09-27 04:30:44 - clean up floating ips
2025-09-27 04:30:44.047926 | orchestrator | 2025-09-27 04:30:44 - 81.163.193.20
2025-09-27 04:30:44.404418 | orchestrator | 2025-09-27 04:30:44 - clean up routers
2025-09-27 04:30:44.538778 | orchestrator | 2025-09-27 04:30:44 - testbed
2025-09-27 04:30:45.603828 | orchestrator | ok: Runtime: 0:00:17.608748
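The cleanup task walks the testbed resources in dependency order: servers and keypairs first, then (once the servers are gone) ports, volumes, router interfaces, subnets, networks, security groups, floating IPs and finally the router itself. The sketch below shows that ordering with openstacksdk; it is not the script the testbed actually runs, and the name prefixes and filters are assumptions based on the resource names visible in the log.

```python
import time

import openstack
from openstack import exceptions as sdk_exc

def clean_cloud(cloud: str = "testbed", prefix: str = "testbed") -> None:
    """Best-effort teardown in roughly the order shown in the console output."""
    conn = openstack.connect(cloud=cloud)

    print("clean up servers")
    for server in conn.compute.servers():
        if server.name.startswith(prefix):
            conn.compute.delete_server(server)

    print("clean up keypairs")
    for keypair in conn.compute.keypairs():
        if keypair.name.startswith(prefix):
            conn.compute.delete_keypair(keypair)

    print("wait for servers to be gone")
    for _ in range(60):  # bounded wait instead of looping forever
        if not any(s.name.startswith(prefix) for s in conn.compute.servers()):
            break
        time.sleep(5)

    # Networks created for the testbed; their IDs scope the port cleanup.
    networks = [n for n in conn.network.networks() if n.name.startswith(f"net-{prefix}")]
    net_ids = {n.id for n in networks}

    print("clean up ports")
    for port in conn.network.ports():
        if port.network_id in net_ids and not (port.device_owner or "").startswith("network:router"):
            conn.network.delete_port(port)

    print("clean up volumes")
    for volume in conn.block_storage.volumes():
        if volume.name.startswith(f"{prefix}-volume"):
            conn.block_storage.delete_volume(volume)

    print("disconnect routers")
    routers = [r for r in conn.network.routers() if r.name.startswith(prefix)]
    subnets = [s for s in conn.network.subnets() if s.name.startswith(f"subnet-{prefix}")]
    for router in routers:
        for subnet in subnets:
            try:
                conn.network.remove_interface_from_router(router, subnet_id=subnet.id)
            except sdk_exc.SDKException:
                pass  # interface may already be gone

    print("clean up subnets, networks, security groups")
    for subnet in subnets:
        conn.network.delete_subnet(subnet)
    for network in networks:
        conn.network.delete_network(network)
    for group in conn.network.security_groups():
        if group.name.startswith(prefix):
            conn.network.delete_security_group(group)

    print("clean up floating ips and routers")
    for ip in conn.network.ips():
        if not ip.port_id:  # only release addresses that are no longer attached
            conn.network.delete_ip(ip)
    for router in routers:
        conn.network.delete_router(router)
```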
2025-09-27 04:30:45.607321 |
2025-09-27 04:30:45.607594 | PLAY RECAP
2025-09-27 04:30:45.607786 | orchestrator | ok: 6 changed: 2 unreachable: 0 failed: 0 skipped: 7 rescued: 0 ignored: 0
2025-09-27 04:30:45.607867 |
2025-09-27 04:30:45.754916 | POST-RUN END RESULT_NORMAL: [untrusted : github.com/osism/testbed/playbooks/post.yml@main]
2025-09-27 04:30:45.756386 | POST-RUN START: [untrusted : github.com/osism/testbed/playbooks/cleanup.yml@main]
2025-09-27 04:30:46.479540 |
2025-09-27 04:30:46.479712 | PLAY [Cleanup play]
2025-09-27 04:30:46.494965 |
2025-09-27 04:30:46.495088 | TASK [Set cloud fact (Zuul deployment)]
2025-09-27 04:30:46.548789 | orchestrator | ok
2025-09-27 04:30:46.556373 |
2025-09-27 04:30:46.556507 | TASK [Set cloud fact (local deployment)]
2025-09-27 04:30:46.580975 | orchestrator | skipping: Conditional result was False
2025-09-27 04:30:46.589243 |
2025-09-27 04:30:46.589355 | TASK [Clean the cloud environment]
2025-09-27 04:30:47.697350 | orchestrator | 2025-09-27 04:30:47 - clean up servers
2025-09-27 04:30:48.292927 | orchestrator | 2025-09-27 04:30:48 - clean up keypairs
2025-09-27 04:30:48.312245 | orchestrator | 2025-09-27 04:30:48 - wait for servers to be gone
2025-09-27 04:30:48.349938 | orchestrator | 2025-09-27 04:30:48 - clean up ports
2025-09-27 04:30:48.423539 | orchestrator | 2025-09-27 04:30:48 - clean up volumes
2025-09-27 04:30:48.482416 | orchestrator | 2025-09-27 04:30:48 - disconnect routers
2025-09-27 04:30:48.504887 | orchestrator | 2025-09-27 04:30:48 - clean up subnets
2025-09-27 04:30:48.542833 | orchestrator | 2025-09-27 04:30:48 - clean up networks
2025-09-27 04:30:48.702639 | orchestrator | 2025-09-27 04:30:48 - clean up security groups
2025-09-27 04:30:48.735853 | orchestrator | 2025-09-27 04:30:48 - clean up floating ips
2025-09-27 04:30:48.760668 | orchestrator | 2025-09-27 04:30:48 - clean up routers
2025-09-27 04:30:49.124347 | orchestrator | ok: Runtime: 0:00:01.430315
2025-09-27 04:30:49.126151 |
2025-09-27 04:30:49.126236 | PLAY RECAP
2025-09-27 04:30:49.126291 | orchestrator | ok: 2 changed: 1 unreachable: 0 failed: 0 skipped: 1 rescued: 0 ignored: 0
2025-09-27 04:30:49.126316 |
2025-09-27 04:30:49.267564 | POST-RUN END RESULT_NORMAL: [untrusted : github.com/osism/testbed/playbooks/cleanup.yml@main]
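The second cleanup pass (cleanup.yml) finds nothing left to delete and finishes in about a second, which is the expected idempotent behaviour after the first pass already tore everything down. When post-processing these console logs, the PLAY RECAP lines are the simplest health signal to extract; a small helper, assuming the line format shown above with the console timestamp already stripped, might look like this:

```python
import re
from typing import Optional

# Matches the summary Zuul prints after each play, e.g.
#   orchestrator | ok: 6 changed: 2 unreachable: 0 failed: 0 skipped: 7 rescued: 0 ignored: 0
RECAP = re.compile(
    r"^(?P<host>\S+) \| ok: (?P<ok>\d+) changed: (?P<changed>\d+) "
    r"unreachable: (?P<unreachable>\d+) failed: (?P<failed>\d+) "
    r"skipped: (?P<skipped>\d+) rescued: (?P<rescued>\d+) ignored: (?P<ignored>\d+)$"
)

def parse_recap(line: str) -> Optional[dict]:
    """Return the recap counters as integers, or None if the line is not a recap."""
    match = RECAP.match(line.strip())
    if not match:
        return None
    counts = {k: int(v) for k, v in match.groupdict().items() if k != "host"}
    return {"host": match.group("host"), **counts}

print(parse_recap("orchestrator | ok: 2 changed: 1 unreachable: 0 failed: 0 "
                  "skipped: 1 rescued: 0 ignored: 0"))
```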
2025-09-27 04:30:49.269060 | POST-RUN START: [trusted : github.com/osism/zuul-config/playbooks/base/post-fetch.yaml@main]
2025-09-27 04:30:50.001217 |
2025-09-27 04:30:50.001355 | PLAY [Base post-fetch]
2025-09-27 04:30:50.016288 |
2025-09-27 04:30:50.016404 | TASK [fetch-output : Set log path for multiple nodes]
2025-09-27 04:30:50.072388 | orchestrator | skipping: Conditional result was False
2025-09-27 04:30:50.083447 |
2025-09-27 04:30:50.083668 | TASK [fetch-output : Set log path for single node]
2025-09-27 04:30:50.127140 | orchestrator | ok
2025-09-27 04:30:50.134119 |
2025-09-27 04:30:50.134299 | LOOP [fetch-output : Ensure local output dirs]
2025-09-27 04:30:50.620954 | orchestrator -> localhost | ok: "/var/lib/zuul/builds/ecfe00c5452b48c9945e9f444f5b6112/work/logs"
2025-09-27 04:30:50.911984 | orchestrator -> localhost | changed: "/var/lib/zuul/builds/ecfe00c5452b48c9945e9f444f5b6112/work/artifacts"
2025-09-27 04:30:51.184139 | orchestrator -> localhost | changed: "/var/lib/zuul/builds/ecfe00c5452b48c9945e9f444f5b6112/work/docs"
2025-09-27 04:30:51.201224 |
2025-09-27 04:30:51.201437 | LOOP [fetch-output : Collect logs, artifacts and docs]
2025-09-27 04:30:52.111000 | orchestrator | changed: .d..t...... ./
2025-09-27 04:30:52.111319 | orchestrator | changed: All items complete
2025-09-27 04:30:52.111371 |
2025-09-27 04:30:52.827850 | orchestrator | changed: .d..t...... ./
2025-09-27 04:30:53.522790 | orchestrator | changed: .d..t...... ./
2025-09-27 04:30:53.556526 |
2025-09-27 04:30:53.556695 | LOOP [merge-output-to-logs : Move artifacts and docs to logs dir]
2025-09-27 04:30:53.592171 | orchestrator | skipping: Conditional result was False
2025-09-27 04:30:53.594807 | orchestrator | skipping: Conditional result was False
2025-09-27 04:30:53.616521 |
2025-09-27 04:30:53.616617 | PLAY RECAP
2025-09-27 04:30:53.616696 | orchestrator | ok: 3 changed: 2 unreachable: 0 failed: 0 skipped: 2 rescued: 0 ignored: 0
2025-09-27 04:30:53.616731 |
2025-09-27 04:30:53.729690 | POST-RUN END RESULT_NORMAL: [trusted : github.com/osism/zuul-config/playbooks/base/post-fetch.yaml@main]
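fetch-output is the counterpart of stage-output: the executor creates logs/artifacts/docs directories under its own work dir (the build path shown above) and pulls the staged trees back from the node, which is where the rsync itemized-change lines such as ".d..t...... ./" come from. A rough sketch of that collection step follows; only the executor work directory is taken from the log, while the remote address and staging path are placeholders.

```python
import subprocess
from pathlib import Path

# Executor-side work directory for this build (path taken from the log above).
WORK_ROOT = Path("/var/lib/zuul/builds/ecfe00c5452b48c9945e9f444f5b6112/work")
# Placeholders: where the node staged its output and how the executor reaches it.
REMOTE = "zuul@orchestrator"
REMOTE_STAGING = "~/zuul-output"

def fetch_outputs() -> None:
    for name in ("logs", "artifacts", "docs"):
        dest = WORK_ROOT / name
        dest.mkdir(parents=True, exist_ok=True)  # "Ensure local output dirs"
        # -a preserves permissions and times, -i prints the itemized-changes
        # lines (e.g. ".d..t...... ./") that show up in the console output.
        subprocess.run(
            ["rsync", "-a", "-i", f"{REMOTE}:{REMOTE_STAGING}/{name}/", f"{dest}/"],
            check=True,
        )
```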
2025-09-27 04:30:53.732168 | POST-RUN START: [trusted : github.com/osism/zuul-config/playbooks/base/post.yaml@main]
2025-09-27 04:30:54.470079 |
2025-09-27 04:30:54.470224 | PLAY [Base post]
2025-09-27 04:30:54.484061 |
2025-09-27 04:30:54.484185 | TASK [remove-build-sshkey : Remove the build SSH key from all nodes]
2025-09-27 04:30:55.441167 | orchestrator | changed
2025-09-27 04:30:55.455309 |
2025-09-27 04:30:55.455506 | PLAY RECAP
2025-09-27 04:30:55.455609 | orchestrator | ok: 1 changed: 1 unreachable: 0 failed: 0 skipped: 0 rescued: 0 ignored: 0
2025-09-27 04:30:55.455765 |
2025-09-27 04:30:55.570638 | POST-RUN END RESULT_NORMAL: [trusted : github.com/osism/zuul-config/playbooks/base/post.yaml@main]
2025-09-27 04:30:55.571697 | POST-RUN START: [trusted : github.com/osism/zuul-config/playbooks/base/post-logs.yaml@main]
2025-09-27 04:30:56.326057 |
2025-09-27 04:30:56.326213 | PLAY [Base post-logs]
2025-09-27 04:30:56.336395 |
2025-09-27 04:30:56.336522 | TASK [generate-zuul-manifest : Generate Zuul manifest]
2025-09-27 04:30:56.802133 | localhost | changed
2025-09-27 04:30:56.820566 |
2025-09-27 04:30:56.820808 | TASK [generate-zuul-manifest : Return Zuul manifest URL to Zuul]
2025-09-27 04:30:56.858817 | localhost | ok
2025-09-27 04:30:56.864252 |
2025-09-27 04:30:56.864398 | TASK [Set zuul-log-path fact]
2025-09-27 04:30:56.883327 | localhost | ok
2025-09-27 04:30:56.896066 |
2025-09-27 04:30:56.896217 | TASK [set-zuul-log-path-fact : Set log path for a build]
2025-09-27 04:30:56.933514 | localhost | ok
2025-09-27 04:30:56.939591 |
2025-09-27 04:30:56.939779 | TASK [upload-logs : Create log directories]
2025-09-27 04:30:57.439849 | localhost | changed
2025-09-27 04:30:57.444132 |
2025-09-27 04:30:57.444290 | TASK [upload-logs : Ensure logs are readable before uploading]
2025-09-27 04:30:57.907423 | localhost -> localhost | ok: Runtime: 0:00:00.007976
2025-09-27 04:30:57.915398 |
2025-09-27 04:30:57.915567 | TASK [upload-logs : Upload logs to log server]
2025-09-27 04:30:58.492994 | localhost | Output suppressed because no_log was given
2025-09-27 04:30:58.497235 |
2025-09-27 04:30:58.497435 | LOOP [upload-logs : Compress console log and json output]
2025-09-27 04:30:58.557157 | localhost | skipping: Conditional result was False
2025-09-27 04:30:58.561195 | localhost | skipping: Conditional result was False
2025-09-27 04:30:58.567901 |
2025-09-27 04:30:58.568086 | LOOP [upload-logs : Upload compressed console log and json output]
2025-09-27 04:30:58.612847 | localhost | skipping: Conditional result was False
2025-09-27 04:30:58.613383 |
2025-09-27 04:30:58.617515 | localhost | skipping: Conditional result was False
2025-09-27 04:30:58.630202 |
2025-09-27 04:30:58.630429 | LOOP [upload-logs : Upload console log and json output]